(docs) ollama/json mode

ishaan-jaff 2023-12-25 13:31:01 +05:30
parent 9f79f75635
commit 485f05ea02


@@ -58,6 +58,24 @@ async def async_ollama():
import asyncio
asyncio.run(async_ollama())
```
## Example Usage - JSON Mode
To use Ollama JSON mode, pass `format="json"` to `litellm.completion()`:
```python
from litellm import completion
response = completion(
    model="ollama/llama2",
    messages=[
        {
            "role": "user",
            "content": "respond in json, what's the weather"
        }
    ],
    max_tokens=10,
    format="json"
)
```
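The model's answer comes back as a JSON-formatted string in the message content. A minimal sketch of parsing it, assuming the response follows litellm's OpenAI-compatible shape (`response.choices[0].message.content`) and the model emitted parseable JSON:

```python
import json

# Assumption: the `response` object from the completion call above is in scope
raw_content = response.choices[0].message.content

# Parse the JSON string into a Python dict; guard against truncated or invalid output
try:
    parsed = json.loads(raw_content)
    print(parsed)
except json.JSONDecodeError:
    print("Model did not return valid JSON:", raw_content)
```

Note that a low `max_tokens` value (such as 10 in the example above) can cut off the JSON mid-object, which is why the parse is wrapped in a try/except.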
## Ollama Models
For the full list of supported Ollama models, see https://github.com/jmorganca/ollama