forked from phoenix/litellm-mirror
Merge pull request #2205 from BerriAI/litellm_docs_use_azure_ai
[Docs] use azure ai studio + mistral large
This commit is contained in:
commit
6c200b3d5e
2 changed files with 38 additions and 0 deletions
37 docs/my-website/docs/providers/azure_ai.md Normal file
@@ -0,0 +1,37 @@
# Azure AI Studio
## Using Mistral models deployed on Azure AI Studio
**Ensure you have the `/v1` in your api_base**
### Sample Usage
```python
from litellm import completion
import os

response = completion(
    model="mistral/Mistral-large-dfgfj",
    api_base="https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1",
    api_key="JGbKodRcTp****",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)

print(response)
```
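Since a missing `/v1` suffix is an easy mistake, the base URL can be checked before it is passed to `completion()`. A minimal stdlib sketch (the helper name is ours, not a litellm API):

```python
from urllib.parse import urlparse

def has_v1_suffix(api_base: str) -> bool:
    """True if the URL path ends with /v1, ignoring a trailing slash."""
    return urlparse(api_base).path.rstrip("/").endswith("/v1")

# The serverless endpoint from the sample above:
good = "https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1"
bad = good.removesuffix("/v1")

print(has_v1_suffix(good))  # True
print(has_v1_suffix(bad))   # False
```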
### [LiteLLM Proxy] Using Mistral Models
Set this in your litellm proxy `config.yaml`:
**Ensure you have the `/v1` in your api_base**
```yaml
model_list:
  - model_name: mistral
    litellm_params:
      model: mistral/Mistral-large-dfgfj
      api_base: https://Mistral-large-dfgfj-serverless.eastus2.inference.ai.azure.com/v1
      api_key: JGbKodRcTp****
```
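Once the proxy is started with this config, requests go to its OpenAI-compatible chat endpoint using the `model_name` from the config. A usage sketch, assuming litellm's default proxy port (4000 in recent versions; yours may differ):

```shell
# Start the proxy (assumes litellm is installed):
#   litellm --config config.yaml
# Then send an OpenAI-format chat request to it:
curl http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral",
    "messages": [{"role": "user", "content": "hello from litellm"}]
  }'
```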
@@ -120,6 +120,7 @@ const sidebars = {
         "providers/openai",
         "providers/openai_compatible",
         "providers/azure",
+        "providers/azure_ai",
         "providers/huggingface",
         "providers/ollama",
         "providers/vertex",