From 4c07cf423da36845ba4374512a46d2aa25c8cfef Mon Sep 17 00:00:00 2001
From: Ishaan Jaff
Date: Sat, 5 Aug 2023 15:10:49 -0700
Subject: [PATCH] Update README.md

---
 README.md | 12 ++++++++++--
 1 file changed, 10 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 5360ce58c..4ba33d4bf 100644
--- a/README.md
+++ b/README.md
@@ -51,11 +51,19 @@ Stable version
 pip install litellm==0.1.1
 ```
 
+Streaming Queries
+liteLLM supports streaming the model response back. Pass `stream=True` to get a streaming iterator in the response.
+```
+response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
+for chunk in response:
+    print(chunk['choices'][0]['delta'])
+```
+
 # hosted version
 - [Grab time if you want access 👋](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version)
 
-# why did I build this
-- **Need for simplicity**: My code started to get extremely complicated managing & translating calls between Azure, OpenAI, Cohere
+# why did we build this
+- **Need for simplicity**: Our code started to get extremely complicated managing & translating calls between Azure, OpenAI, Cohere
 
 # Support
 Contact us at ishaan@berri.ai / krrish@berri.ai
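
For context, a self-contained version of the streaming snippet this patch documents might look like the sketch below. It assumes `litellm` is installed and an OpenAI API key is configured in the environment; the `messages` list and the accumulation of `delta` content into `full_text` are illustrative additions, not part of the patch, and the dict-style chunk access simply mirrors the snippet added above.

```
# Minimal sketch of the streaming usage documented in this patch.
# Assumes litellm is installed and OPENAI_API_KEY is set in the environment.
from litellm import completion

# Illustrative prompt; the patch assumes `messages` is defined elsewhere in the README.
messages = [{"role": "user", "content": "Hey, how's it going?"}]

# stream=True returns an iterator of incremental chunks instead of one full response.
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)

full_text = ""
for chunk in response:
    # Each chunk carries an OpenAI-style delta, as shown in the patch's snippet.
    delta = chunk['choices'][0]['delta']
    full_text += delta.get('content', '') or ''

print(full_text)
```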