diff --git a/docs/my-website/docs/providers/openai_compatible.md b/docs/my-website/docs/providers/openai_compatible.md
index c7f9bf6f40..f7bc06ff62 100644
--- a/docs/my-website/docs/providers/openai_compatible.md
+++ b/docs/my-website/docs/providers/openai_compatible.md
@@ -11,6 +11,12 @@ To call models hosted behind an openai proxy, make 2 changes:
 
 2. **Do NOT** add anything additional to the base url e.g. `/v1/embedding`. LiteLLM uses the openai-client to make these calls, and that automatically adds the relevant endpoints.
 
+:::warning
+
+The API base URL must include the `/v1` suffix, e.g. `https://api.openai.com/v1`.
+
+:::
+
 ## Usage - completion
 
 ```python
diff --git a/docs/my-website/docs/providers/voyage.md b/docs/my-website/docs/providers/voyage.md
index 6ab6b1846f..1d0748139c 100644
--- a/docs/my-website/docs/providers/voyage.md
+++ b/docs/my-website/docs/providers/voyage.md
@@ -1,6 +1,24 @@
 # Voyage AI
 https://docs.voyageai.com/embeddings/
 
+## Proxy settings
+
+| Model Settings      | Value                                                           |
+| ------------------- | --------------------------------------------------------------- |
+| Provider            | OpenAI-Compatible Endpoint                                      |
+| API Base            | https://api.voyageai.com/v1                                     |
+| Model Name          | e.g. voyage-3-large                                             |
+| LiteLLM Model Name  | e.g. voyage/voyage-3-large                                      |
+| Custom LLM Provider | voyage (cannot be set on creation, only on the model edit page) |
+
+
+:::info
+
+The API Base must include the `/v1` suffix, and the LiteLLM Model Name must include the `voyage/` prefix.
+
+:::
+
+
 ## API Key
 ```python
 # env variable
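
For readers configuring the proxy from a file rather than the UI, the same Voyage settings from the table can also be sketched as a LiteLLM proxy `config.yaml` entry. This is an assumption-laden sketch (the `model_name` alias and the `VOYAGE_API_KEY` variable name are illustrative choices, not mandated by the docs above):

```yaml
# Sketch of a LiteLLM proxy config entry for Voyage AI embeddings.
# model_name and the env var name are illustrative, not prescribed.
model_list:
  - model_name: voyage-3-large
    litellm_params:
      model: voyage/voyage-3-large          # voyage/ prefix selects the provider
      api_base: https://api.voyageai.com/v1 # must end in /v1
      api_key: os.environ/VOYAGE_API_KEY
```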
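
The warning added above exists because the openai client appends the endpoint path (e.g. `/embeddings`) to the configured base URL itself. A minimal illustrative sketch of how the final request URL is formed — this is not the actual openai/LiteLLM client code, just the joining behavior the docs describe:

```python
# Illustrative sketch only -- not the real openai/LiteLLM client code.
# The client appends the endpoint path to the configured api_base, which
# is why the base must end at "/v1" with nothing extra after it.

def build_request_url(api_base: str, endpoint: str = "embeddings") -> str:
    """Join the configured base URL with the endpoint the client selects."""
    return f"{api_base.rstrip('/')}/{endpoint}"

print(build_request_url("https://api.voyageai.com/v1"))
# https://api.voyageai.com/v1/embeddings
```

Setting the base to `https://api.openai.com/v1/embedding` would produce a doubled path once the client appends its own endpoint, which is why the docs say not to add anything beyond `/v1`.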