Merge pull request #903 from kumaranvpl/main

Fix wrong model mentioned in simple proxy doc page
Ishaan Jaff 2023-11-24 09:39:06 -08:00 committed by GitHub
commit 2e140fe56a
2 changed files with 2 additions and 2 deletions


@@ -33,7 +33,7 @@ LiteLLM manages
# Usage ([**Docs**](https://docs.litellm.ai/docs/))
> [!IMPORTANT]
-> LiteLLM v1.0.0 is now requires `openai>=1.0.0`. Migration guide [here](https://docs.litellm.ai/docs/migration)
+> LiteLLM v1.0.0 now requires `openai>=1.0.0`. Migration guide [here](https://docs.litellm.ai/docs/migration)
<a target="_blank" href="https://colab.research.google.com/github/BerriAI/litellm/blob/main/cookbook/liteLLM_Getting_Started.ipynb">


@@ -791,7 +791,7 @@ Set a model alias for your deployments.
In the `config.yaml` the model_name parameter is the user-facing name to use for your deployment.
-In the config below requests with `model=gpt-4` will route to `ollama/zephyr`
+In the config below requests with `model=gpt-4` will route to `ollama/llama2`
```yaml
model_list:
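  # NOTE: the diff truncates the config example here. The entry below is an
  # assumed sketch (not part of this commit) illustrating the alias behaviour
  # described above: `model_name` is the user-facing name a client sends, and
  # `litellm_params.model` is the deployment that request is routed to.
  - model_name: gpt-4            # name clients pass as `model` in the request
    litellm_params:
      model: ollama/llama2       # deployment that `gpt-4` requests route to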