fix(utils.py): return function name for ollama_chat function calls

Krrish Dholakia 2024-03-08 08:01:10 -08:00
parent b4e12fb8fd
commit 0e7b30bec9
4 changed files with 79 additions and 26 deletions


@@ -5,6 +5,12 @@ LiteLLM supports all models from [Ollama](https://github.com/jmorganca/ollama)
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
:::info
We recommend using [ollama_chat](#using-ollama-apichat) for better responses.
:::
## Pre-requisites
Ensure you have your Ollama server running.
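The recommended `ollama_chat` route targets Ollama's `/api/chat` endpoint rather than `/api/generate`. As a rough, stdlib-only sketch of the chat-style request body that endpoint expects (field names follow Ollama's API; the model name `llama2` and the message content are illustrative, not taken from this commit):

```python
import json

# Chat-style request body for Ollama's /api/chat endpoint.
# "model" and "messages" are the required fields; "stream": False asks
# for a single JSON response instead of a streamed one.
payload = {
    "model": "llama2",  # example model; use whatever you have pulled locally
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}

# POSTing this to http://localhost:11434/api/chat (the default Ollama port)
# returns the assistant's reply; here we just show the serialized body.
print(json.dumps(payload))
```

With LiteLLM itself you would not build this payload by hand; prefixing the model name with `ollama_chat/` (e.g. `ollama_chat/llama2`) makes LiteLLM send an equivalent request for you.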