diff --git a/docs/my-website/docs/embedding/supported_embedding.md b/docs/my-website/docs/embedding/supported_embedding.md
index 00948a5b1..2ee71fbd2 100644
--- a/docs/my-website/docs/embedding/supported_embedding.md
+++ b/docs/my-website/docs/embedding/supported_embedding.md
@@ -1,6 +1,6 @@
 # Embedding Models

-## Usage
+## Quick Start
 ```python
 from litellm import embedding
 import os
@@ -10,12 +10,29 @@ response = embedding('text-embedding-ada-002', input=["good morning from litellm

 ## OpenAI Embedding Models

+### Usage
+```python
+from litellm import embedding
+import os
+os.environ['OPENAI_API_KEY'] = ""
+response = embedding('text-embedding-ada-002', input=["good morning from litellm"])
+```
+
 | Model Name | Function Call | Required OS Variables |
 |----------------------|---------------------------------------------|--------------------------------------|
 | text-embedding-ada-002 | `embedding('text-embedding-ada-002', input)` | `os.environ['OPENAI_API_KEY']` |

 ## Azure OpenAI Embedding Models

+### API keys
+These can be set as environment variables or passed as **params to litellm.embedding()**
+```python
+import os
+os.environ['AZURE_API_KEY'] = ""
+os.environ['AZURE_API_BASE'] = ""
+os.environ['AZURE_API_VERSION'] = ""
+```
+
 ### Usage
 ```python
 from litellm import embedding
@@ -29,15 +46,16 @@ response = embedding(
 print(response)
 ```

-| Model Name | Function Call | Required OS Variables |
-|----------------------|---------------------------------------------|--------------------------------------|
-| text-embedding-ada-002 | `embedding('embedding-model-deployment', input=input, custom_llm_provider="azure")` | `os.environ['AZURE_API_KEY']`,`os.environ['AZURE_API_BASE']`,`os.environ['AZURE_API_VERSION']` |
+| Model Name | Function Call |
+|----------------------|---------------------------------------------|
+| text-embedding-ada-002 | `embedding(model="azure/<your deployment name>", input=input)` |

 h/t to [Mikko](https://www.linkedin.com/in/mikkolehtimaki/) for this integration

 ## Cohere Embedding Models
 https://docs.cohere.com/reference/embed

+### Usage
 ```python
 from litellm import embedding
 import os
@@ -54,6 +72,7 @@ response = embedding('embed-english-v2.0', input=["good morning from litellm"])

 ## HuggingFace Embedding Models
 LiteLLM supports all Feature-Extraction Embedding models: https://huggingface.co/models?pipeline_tag=feature-extraction

+### Usage
 ```python
 from litellm import embedding
 import os
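
Note: the new "### API keys" section says the Azure credentials can also be passed as **params to litellm.embedding()**, but the hunk elides the usage block that demonstrates it. Below is a minimal sketch of that option, assuming the `api_key`, `api_base` and `api_version` keyword arguments used elsewhere in this doc; the deployment name and all credential values are placeholders, not real values.

```python
from litellm import embedding

# Pass the Azure credentials directly instead of setting AZURE_API_* env variables.
# "my-embedding-deployment" and the credential strings below are placeholders.
response = embedding(
    model="azure/my-embedding-deployment",
    input=["good morning from litellm"],
    api_key="<your azure api key>",
    api_base="https://<your resource name>.openai.azure.com/",
    api_version="<your azure api version>",
)
print(response)
```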