ishaan-jaff 2023-09-21 21:06:21 -07:00
parent 20e965e05e
commit f07ce0ba59
2 changed files with 2 additions and 1 deletion


@@ -55,7 +55,7 @@ Here's what an example response looks like
You can also access information like latency.
```
```python
import os
import litellm

# litellm reads the Anthropic key from the environment
os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-5b46387675a944d2"
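# --- Illustrative sketch, not part of this commit: the exact timing fields on
# the response object may differ across litellm versions, so the call is timed
# manually here; the model name and prompt are assumed for the example. ---
import time

start = time.time()
response = litellm.completion(
    model="claude-instant-1",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)
print(f"completion latency: {time.time() - start:.2f}s")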


@@ -4,6 +4,7 @@ LiteLLM supports all models from [Ollama](https://github.com/jmorganca/ollama)
<a target="_blank" href="https://colab.research.google.com/github/BerriAI/litellm/blob/main/cookbook/liteLLM_Ollama.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
## Pre-requisites
Ensure your Ollama server is running.
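
As a quick sanity check that LiteLLM can reach the server, a minimal sketch along these lines can be used. It assumes Ollama is listening on its default port 11434 and that a `llama2` model has already been pulled; neither detail comes from this diff.

```python
import litellm

# Route the request to the local Ollama server via the "ollama/" model prefix
response = litellm.completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    api_base="http://localhost:11434",
)
print(response)
```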