Merge pull request #2205 from BerriAI/litellm_docs_use_azure_ai

[Docs] use azure ai studio + mistral large
This commit is contained in:
Ishaan Jaff 2024-02-26 14:33:49 -08:00 committed by GitHub
commit 6c200b3d5e
2 changed files with 38 additions and 0 deletions

@@ -0,0 +1,37 @@
# Azure AI Studio
## Using Mistral models deployed on Azure AI Studio
**Ensure your `api_base` includes the `/v1` suffix**
### Sample Usage
```python
from litellm import completion

response = completion(
    model="mistral/Mistral-large-dfgfj",
    api_base="https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1",
    api_key="JGbKodRcTp****",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
### [LiteLLM Proxy] Using Mistral Models
Set this in your LiteLLM Proxy `config.yaml`.
**Ensure your `api_base` includes the `/v1` suffix**
```yaml
model_list:
- model_name: mistral
litellm_params:
model: mistral/Mistral-large-dfgfj
api_base: https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1
api_key: JGbKodRcTp****
```
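With the config in place, the proxy can be started and queried like any OpenAI-compatible endpoint. A minimal sketch, assuming the proxy listens on its default port 4000 and the config path shown is hypothetical:

```shell
# Start the LiteLLM Proxy with the config above
litellm --config /path/to/config.yaml

# Send a chat completion request; "mistral" matches the model_name in config.yaml
curl http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "hello from litellm"}]
  }'
```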

@@ -120,6 +120,7 @@ const sidebars = {
"providers/openai",
"providers/openai_compatible",
"providers/azure",
"providers/azure_ai",
"providers/huggingface",
"providers/ollama",
"providers/vertex",