
# Mistral AI API

API reference: https://docs.mistral.ai/api/

## API Key

```python
# env variable
os.environ['MISTRAL_API_KEY']
```
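If you prefer not to rely on the environment variable, the key can also be supplied per call; a minimal sketch, assuming the `api_key` parameter accepted by `litellm.completion` (the call itself is commented out so it does not hit the network, and the key value is a placeholder):

```python
import os

# Option 1: set the key via the environment; LiteLLM picks it up automatically.
os.environ["MISTRAL_API_KEY"] = "your-api-key"  # placeholder value

# Option 2: pass the key explicitly on each call (sketch, not executed here):
# from litellm import completion
# response = completion(
#     model="mistral/mistral-tiny",
#     messages=[{"role": "user", "content": "hello from litellm"}],
#     api_key=os.environ["MISTRAL_API_KEY"],
# )
```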

## Sample Usage

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""
response = completion(
    model="mistral/mistral-tiny",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```

## Sample Usage - Streaming

```python
from litellm import completion
import os

os.environ['MISTRAL_API_KEY'] = ""
response = completion(
    model="mistral/mistral-tiny",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True,
)

for chunk in response:
    print(chunk)
```
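Each streamed chunk follows the OpenAI streaming format, with the incremental text in `chunk.choices[0].delta.content` (which may be `None` on the final chunk). A small helper, sketched against that assumed shape, joins the deltas back into one string:

```python
def collect_stream(response):
    """Join the text deltas of an OpenAI-style streaming response."""
    parts = []
    for chunk in response:
        delta = chunk.choices[0].delta.content
        if delta is not None:  # the final chunk's delta content may be None
            parts.append(delta)
    return "".join(parts)
```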

## Supported Models

All models listed at https://docs.mistral.ai/platform/endpoints are supported. We actively maintain the list of models, pricing, token windows, etc.

| Model Name     | Function Call                                           |
|----------------|---------------------------------------------------------|
| mistral-tiny   | `completion(model="mistral/mistral-tiny", messages)`    |
| mistral-small  | `completion(model="mistral/mistral-small", messages)`   |
| mistral-medium | `completion(model="mistral/mistral-medium", messages)`  |

## Sample Usage - Embedding

```python
from litellm import embedding
import os

os.environ['MISTRAL_API_KEY'] = ""
response = embedding(
    model="mistral/mistral-embed",
    input=["good morning from litellm"],
)
print(response)
```
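The embedding response mirrors the OpenAI format, so the vector itself sits under `data[0]["embedding"]`; a sketch of pulling it out, assuming that dict-like shape:

```python
def first_embedding(response):
    """Return the first embedding vector from an OpenAI-style response.

    Assumes the response is dict-like with the vector at data[0]["embedding"].
    """
    return response["data"][0]["embedding"]
```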

## Supported Embedding Models

All models listed at https://docs.mistral.ai/platform/endpoints are supported.

| Model Name    | Function Call                                     |
|---------------|---------------------------------------------------|
| mistral-embed | `embedding(model="mistral/mistral-embed", input)` |