From a81c77cfdfff41cf9059a4a71c176fc57118e77e Mon Sep 17 00:00:00 2001
From: Krish Dholakia
Date: Sat, 26 Aug 2023 21:39:42 -0700
Subject: [PATCH] Update README.md

---
 README.md | 7 -------
 1 file changed, 7 deletions(-)

diff --git a/README.md b/README.md
index 115449dfe..64006edde 100644
--- a/README.md
+++ b/README.md
@@ -77,13 +77,6 @@ Stable version
 pip install litellm==0.1.424
 ```
 
-
-# LiteLLM Client - debugging & 1-click add new LLMs
-Debugging Dashboard 👉 https://docs.litellm.ai/docs/debugging/hosted_debugging
-
-![pika-1692887776948-1x](https://github.com/BerriAI/litellm/assets/29436595/44f40714-abdc-4c53-9642-6ba3654209d5)
-
-
 ## Streaming
 liteLLM supports streaming the model response back, pass `stream=True` to get a streaming iterator in response.
 Streaming is supported for OpenAI, Azure, Anthropic, Huggingface models
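
For reference, the streaming behavior described in the retained README context above looks roughly like the sketch below. This is a minimal example, not part of the patch: it assumes `OPENAI_API_KEY` is set (the placeholder value is illustrative) and that streamed chunks follow the OpenAI-style format; the exact chunk schema can differ by litellm version and provider.

```python
# Minimal sketch of streaming with litellm's completion(); assumes an
# OpenAI-compatible model and that OPENAI_API_KEY is available.
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder key for illustration

messages = [{"role": "user", "content": "Hey, how's it going?"}]

# stream=True returns an iterator of partial-response chunks
# instead of a single completed response.
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)

for chunk in response:
    # Each chunk carries an incremental piece of the model's reply
    # (OpenAI-style delta format is assumed here).
    print(chunk)
```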