From dffaaad6c2c07761ceb5c01beae674d45422a4b1 Mon Sep 17 00:00:00 2001
From: Krish Dholakia
Date: Mon, 18 Sep 2023 17:28:46 -0700
Subject: [PATCH] Update README.md

---
 README.md | 5 -----
 1 file changed, 5 deletions(-)

diff --git a/README.md b/README.md
index effb72068..3e19739a8 100644
--- a/README.md
+++ b/README.md
@@ -77,11 +77,6 @@ response = completion(model="command-nightly", messages=messages)
 
 **Don't have a key? We'll give you access 👉 https://docs.litellm.ai/docs/proxy_api**
 
-Stable version
-```
-pip install litellm==0.1.424
-```
-
 ## Streaming
 liteLLM supports streaming the model response back, pass `stream=True` to get a streaming iterator in response.
 Streaming is supported for OpenAI, Azure, Anthropic, Huggingface models