From c63b09857cf59ccb37f6b090f841229b14141556 Mon Sep 17 00:00:00 2001
From: Ishaan Jaff
Date: Wed, 8 Nov 2023 13:39:29 -0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 1546558f0..e973a3f69 100644
--- a/README.md
+++ b/README.md
@@ -94,7 +94,7 @@ response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content
 ```
 
 ## OpenAI Proxy - ([Docs](https://docs.litellm.ai/docs/simple_proxy))
-Use LiteLLM in any OpenAI API compatible project. Call 100+ LLMs Huggingface/Bedrock/TogetherAI/etc in the OpenAI ChatCompletions & Completions format
+**If you don't want to make code changes to add the litellm package to your code base**, you can use litellm proxy. Create a server to call 100+ LLMs (Huggingface/Bedrock/TogetherAI/etc) in the OpenAI ChatCompletions & Completions format
 
 ### Step 1: Start litellm proxy
 ```shell
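# --- Illustrative sketch (not part of the patch above) ---
# The new README line says the proxy spins up a server that exposes 100+ LLMs
# behind the OpenAI ChatCompletions & Completions format. The patch is cut off
# right at the quick-start snippet, so the commands below are only a hedged
# reconstruction of that workflow, not the patch's own content: the backend
# model name, port 8000, and the /chat/completions route are assumptions made
# for illustration.

# Start the proxy for one backend model:
$ litellm --model huggingface/bigcode/starcoder

# Then, from another shell, call it exactly like the OpenAI API:
$ curl http://0.0.0.0:8000/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hey, how is it going?"}]}'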