forked from phoenix/litellm-mirror
fix(utils.py): return function name for ollama_chat function calls

parent b4e12fb8fd
commit 0e7b30bec9

4 changed files with 79 additions and 26 deletions
@@ -5,6 +5,12 @@ LiteLLM supports all models from [Ollama](https://github.com/jmorganca/ollama)

 <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
 </a>

+:::info
+
+We recommend using [ollama_chat](#using-ollama-apichat) for better responses.
+
+:::
+
 ## Pre-requisites

 Ensure you have your ollama server running
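The admonition added by this diff points readers at `ollama_chat`, which routes requests to Ollama's `/api/chat` endpoint instead of the plain `ollama` provider's `/api/generate`. As a minimal sketch (the "ollama_chat/" prefix routing is how LiteLLM model strings select a provider; the payload shape follows Ollama's chat API, and the exact translation LiteLLM performs internally is an assumption here):

```python
import json

# Sketch: split a LiteLLM model string into provider and model name.
# "ollama_chat/llama2" -> provider "ollama_chat" routes to /api/chat.
model = "ollama_chat/llama2"
provider, _, model_name = model.partition("/")

# OpenAI-style messages are forwarded in the /api/chat request body.
payload = {
    "model": model_name,
    "messages": [{"role": "user", "content": "Hello, how are you?"}],
    "stream": False,
}

print(provider)  # -> ollama_chat
print(json.dumps(payload))
```

With a local server running (`ollama serve` on the default `http://localhost:11434`), the equivalent high-level call is `litellm.completion(model="ollama_chat/llama2", messages=payload["messages"])`.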