Update README.md

Ishaan Jaff 2023-08-08 16:19:34 -07:00 committed by GitHub
parent 5bff56c7dc
commit 1b81f4ad79

@@ -51,10 +51,15 @@ pip install litellm==0.1.345
## Streaming Queries
liteLLM supports streaming the model response back; pass `stream=True` to get a streaming iterator in the response.
Streaming is supported for OpenAI, Azure, and Anthropic models.
```python
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    print(chunk['choices'][0]['delta'])

# claude 2
result = litellm.completion('claude-2', messages, stream=True)
for chunk in result:
    print(chunk['choices'][0]['delta'])
```
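
The snippet above covers OpenAI and Anthropic; for Azure, a minimal sketch assuming the same `completion` interface applies (the `azure/<your-deployment-name>` model string and the `messages` definition here are illustrative assumptions and may differ by litellm version):

```python
from litellm import completion

messages = [{"role": "user", "content": "Hey, how's it going?"}]

# assumed Azure naming convention; replace with your own deployment name
response = completion(model="azure/<your-deployment-name>", messages=messages, stream=True)
for chunk in response:
    print(chunk['choices'][0]['delta'])
```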
# hosted version