Lucas 2025-04-24 00:58:35 -07:00 committed by GitHub
commit 950d3ba584
2 changed files with 24 additions and 0 deletions


@@ -11,6 +11,12 @@ To call models hosted behind an openai proxy, make 2 changes:
2. **Do NOT** append anything extra to the base URL, e.g. `/v1/embedding`. LiteLLM uses the OpenAI client to make these calls, and the client appends the relevant endpoint paths automatically.
:::warning
API Base URLs must include the `/v1` suffix, e.g. `https://api.openai.com/v1`.
:::
## Usage - completion
```python
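# Hedged sketch: a completion call routed through an OpenAI-compatible proxy.
# The base URL, model name, and key below are placeholders, not real endpoints.
import litellm

response = litellm.completion(
    model="openai/my-model-name",                # openai/ prefix -> route via the OpenAI client
    api_base="https://my-proxy.example.com/v1",  # base URL only, keep /v1, add nothing else
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "Hello from LiteLLM"}],
)
print(response.choices[0].message.content)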


@@ -1,6 +1,24 @@
# Voyage AI
https://docs.voyageai.com/embeddings/
## Proxy settings
| Setting             | Value                                                              |
| ------------------- | ------------------------------------------------------------------ |
| Provider            | OpenAI-Compatible Endpoint                                          |
| API Base            | `https://api.voyageai.com/v1`                                       |
| Model Name          | e.g. `voyage-3-large`                                               |
| LiteLLM Model Name  | e.g. `voyage/voyage-3-large`                                        |
| Custom LLM Provider | `voyage` (cannot be set on creation, only on the model edit page)   |
:::info
The API Base must include the `/v1` suffix, and the LiteLLM Model Name must include the `voyage/` prefix.
:::
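The same settings can be exercised directly through the LiteLLM Python SDK as a quick sanity check. The sketch below is illustrative: the model name comes from the table above, while the API key value is a placeholder.

```python
# Hedged sketch: an embedding call that mirrors the proxy settings above.
import litellm

response = litellm.embedding(
    model="voyage/voyage-3-large",           # the voyage/ prefix routes to the Voyage provider
    api_base="https://api.voyageai.com/v1",  # keep the /v1 suffix on the API Base
    api_key="your-voyage-api-key",           # placeholder; or set VOYAGE_API_KEY (see below)
    input=["Good morning from LiteLLM"],
)
print(response)
```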
## API Key
```python
# env variable
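# Hedged sketch: export the Voyage key via the environment variable LiteLLM reads.
# The key value is a placeholder.
import os

os.environ["VOYAGE_API_KEY"] = "your-voyage-api-key"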