docs

commit f07ce0ba59
parent 20e965e05e

2 changed files with 2 additions and 1 deletion
````diff
@@ -55,7 +55,7 @@ Here's what an example response looks like
 You can also access information like latency.

-```
+```python
 import litellm

 os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-5b46387675a944d2"
````
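The hunk above only shows the opening lines of that snippet. For context, here is a fuller, runnable sketch of the same flow, assuming the usual `litellm.completion` call and a `claude-instant-1` model name (neither appears in this hunk); latency is measured with a plain timer rather than any LiteLLM-specific response field.

```python
import os
import time

from litellm import completion

# Placeholder key from the docs snippet; replace with a real Anthropic key.
os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-5b46387675a944d2"

messages = [{"role": "user", "content": "Hey, how's it going?"}]

# Time the call to get a simple end-to-end latency measurement.
start = time.perf_counter()
response = completion(model="claude-instant-1", messages=messages)
latency_s = time.perf_counter() - start

print(response["choices"][0]["message"]["content"])
print(f"latency: {latency_s:.2f}s")
```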
````diff
@@ -4,6 +4,7 @@ LiteLLM supports all models from [Ollama](https://github.com/jmorganca/ollama)
 <a target="_blank" href="https://colab.research.google.com/github/BerriAI/litellm/blob/main/cookbook/liteLLM_Ollama.ipynb">
   <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
 </a>

 ## Pre-requisites
 Ensure you have your ollama server running
````
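As a quick check that the pre-requisite is met, a minimal sketch of calling an Ollama-served model through LiteLLM is shown below; the `ollama/llama2` model name and the `http://localhost:11434` server address are assumptions (Ollama's default), not part of this diff.

```python
from litellm import completion

# Assumes `ollama serve` is running locally and the llama2 model has been pulled.
messages = [{"role": "user", "content": "Why is the sky blue?"}]

response = completion(
    model="ollama/llama2",              # "ollama/" prefix routes the call to Ollama
    messages=messages,
    api_base="http://localhost:11434",  # local Ollama server address
)

print(response["choices"][0]["message"]["content"])
```

Swap `llama2` for whichever model your Ollama server has pulled.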