forked from phoenix/litellm-mirror
Update README.md
This commit is contained in:
parent
5bff56c7dc
commit
1b81f4ad79
1 changed file with 5 additions and 0 deletions
@@ -51,10 +51,15 @@ pip install litellm==0.1.345
## Streaming Queries
liteLLM supports streaming the model response back; pass `stream=True` to get a streaming iterator in the response.
Streaming is supported for OpenAI, Azure, and Anthropic models.
```python
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    print(chunk['choices'][0]['delta'])
# claude 2
result = litellm.completion('claude-2', messages, stream=True)
for chunk in result:
    print(chunk['choices'][0]['delta'])
```
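The delta chunks streamed above can be concatenated to rebuild the full reply. A minimal sketch using mock chunks in the same shape (assuming each delta optionally carries a `content` key, as in the OpenAI streaming format; a real run would iterate the iterator returned by `completion(..., stream=True)`):

```python
# Mock chunks mimicking the streamed response shape shown above.
mock_chunks = [
    {'choices': [{'delta': {'role': 'assistant'}}]},
    {'choices': [{'delta': {'content': 'Hello'}}]},
    {'choices': [{'delta': {'content': ', world'}}]},
    {'choices': [{'delta': {}}]},  # a final chunk may carry an empty delta
]

# Join every delta's 'content' field, skipping chunks without one.
full_reply = ''.join(
    chunk['choices'][0]['delta'].get('content', '')
    for chunk in mock_chunks
)
print(full_reply)  # Hello, world
```

Using `.get('content', '')` avoids a `KeyError` on role-only or empty deltas.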
# hosted version