Usage Guide
This page demonstrates how to use darca-embeddings in your projects.
Installation
Ensure you have Python 3.8 or newer installed (check the project metadata for the exact version requirement).
Install using Poetry (or from source if you prefer), for example:
make install
This creates a virtual environment (unless running in CI) and installs all dependencies.
Environment Variables
For the OpenAI backend, you must set an environment variable named OPENAI_API_KEY:
export OPENAI_API_KEY="your-openai-api-key"
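If you prefer to configure the key from Python (for example in a test fixture), you can set it on the process environment before creating a client. The snippet below is a minimal sketch; it assumes the OpenAI backend reads OPENAI_API_KEY from the environment when the client is created.
import os

# Set the key for this process only if it is not already configured;
# the OpenAI backend is assumed to read OPENAI_API_KEY from the environment.
os.environ.setdefault("OPENAI_API_KEY", "your-openai-api-key")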
Basic Usage
Import and initialize an embedding client:
from darca_embeddings import EmbeddingClient

# By default, uses the OpenAI backend with model "text-embedding-ada-002"
client = EmbeddingClient()
Generate a single embedding:
embedding = client.get_embedding("Hello World!")
print(embedding)  # a list of floating-point numbers
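The result is a plain Python list, so you can inspect it directly. The dimensionality depends on the backend model; text-embedding-ada-002 produces 1536-dimensional vectors.
# text-embedding-ada-002 returns 1536-dimensional vectors.
print(len(embedding))
print(embedding[:5])  # first few components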
Generate multiple embeddings:
texts = ["Hello World!", "Another text to embed"] embeddings = client.get_embeddings(texts) for i, emb in enumerate(embeddings): print(f"Embedding for text {i}:", emb)
Backend Support
Currently, OpenAI is the only production-ready backend. You can request a different backend via the backend argument, but unimplemented backends raise an error:
# Attempting the "huggingface" backend will raise an EmbeddingException (unimplemented).
client = EmbeddingClient(backend="huggingface")
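If you want to probe for an alternative backend without aborting, catch the exception and fall back to the default. The import path for EmbeddingException below is an assumption; adjust it to wherever the package actually exposes its exceptions.
from darca_embeddings import EmbeddingClient, EmbeddingException  # import path assumed

try:
    client = EmbeddingClient(backend="huggingface")
except EmbeddingException:
    # Unimplemented backend requested; fall back to the default OpenAI backend.
    client = EmbeddingClient()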
Error Handling
The package defines custom exceptions to help you handle various error scenarios:
EmbeddingAPIKeyMissing: No valid API key found for the chosen provider.
EmbeddingResponseError: An embedding request failed or returned unexpected data.
EmbeddingException: Base class for all darca-embeddings custom exceptions.
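A typical call site catches the specific exceptions first and the base class last. The import paths in this sketch are assumptions, so adjust them to match the package's actual layout.
from darca_embeddings import (  # import paths assumed
    EmbeddingClient,
    EmbeddingAPIKeyMissing,
    EmbeddingResponseError,
    EmbeddingException,
)

client = EmbeddingClient()
try:
    vector = client.get_embedding("Hello World!")
except EmbeddingAPIKeyMissing:
    print("Set OPENAI_API_KEY before using the OpenAI backend.")
except EmbeddingResponseError as exc:
    print(f"The embedding request failed: {exc}")
except EmbeddingException as exc:
    print(f"Unexpected embedding error: {exc}")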
Refer to the API Reference for a full API breakdown.