commit ab18c9b3cd
parent 616ab82181
Author: ishaan-jaff
Date: 2023-07-31 08:25:24 -07:00


@@ -4,7 +4,7 @@ a light 100 line package to simplify calling OpenAI, Azure, Cohere, Anthropic AP
 ###### litellm manages:
 * Calling all LLM APIs using the OpenAI format - `completion(model, messages)`
 * Consistent output for all LLM APIs, text responses will always be available at `['choices'][0]['message']['content']`
-* **[Advanced]** Automatically logging your output to Sentry, Posthog, Slack [see liteLLM Client](/docs/advanced.md)
+* **[Advanced]** Automatically logging your output to Sentry, Posthog, Slack [see liteLLM Client](https://litellm.readthedocs.io/en/latest/advanced/)
 ## Quick Start
 Go directly to code: [Getting Started Notebook](https://colab.research.google.com/drive/1gR3pY-JzDZahzpVdbGBtrNGDBmzUNJaJ?usp=sharing)
@@ -29,7 +29,7 @@ response = completion(model="gpt-3.5-turbo", messages=messages)
 # cohere call
 response = completion("command-nightly", messages)
 ```
-Need Help / Support : [see troubleshooting](/docs/troubleshoot.md)
+Need Help / Support : [see troubleshooting](https://litellm.readthedocs.io/en/latest/troubleshoot)
 ## Why did we build liteLLM
 - **Need for simplicity**: Our code started to get extremely complicated managing & translating calls between Azure, OpenAI, Cohere
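The README lines touched by this diff promise one consistent OpenAI-style response shape across providers, with the text always at `['choices'][0]['message']['content']`. A minimal sketch of that access pattern, using a mocked response dict instead of a live `litellm.completion` call (a real call needs provider API keys; the mock's content string is illustrative only):

```python
# Mocked response mirroring the OpenAI format that litellm normalizes
# every provider's output to. A live call would instead be:
#   from litellm import completion
#   response = completion(model="gpt-3.5-turbo", messages=messages)
mock_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello from any LLM"}}
    ]
}

# Regardless of provider (OpenAI, Azure, Cohere, Anthropic), the text
# lives at the same path in the response:
text = mock_response["choices"][0]["message"]["content"]
print(text)
```

Because the shape is uniform, downstream code can swap `model="gpt-3.5-turbo"` for `"command-nightly"` without changing how it reads the reply.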