forked from phoenix/litellm-mirror
Update README.md
This commit is contained in:
parent
912835bcac
commit
dffaaad6c2
1 changed file with 0 additions and 5 deletions
````diff
@@ -77,11 +77,6 @@ response = completion(model="command-nightly", messages=messages)
 **Don't have a key? We'll give you access 👉 https://docs.litellm.ai/docs/proxy_api**
 
-Stable version
-```
-pip install litellm==0.1.424
-```
-
 ## Streaming
 liteLLM supports streaming the model response back, pass `stream=True` to get a streaming iterator in response.
 
 Streaming is supported for OpenAI, Azure, Anthropic, Huggingface models
````
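The streaming usage described in the diff's context lines can be sketched as below. This is an illustrative sketch, not part of the commit: it assumes `litellm` is installed, an `OPENAI_API_KEY` is set in the environment, and the model name and chunk layout are assumptions rather than details from this diff.

```python
import os

# Example prompt in the chat-message format litellm's completion() accepts
messages = [{"role": "user", "content": "Hello, how are you?"}]

# Guarded so the network call only runs when credentials are available
if os.environ.get("OPENAI_API_KEY"):
    from litellm import completion  # requires `pip install litellm`

    # stream=True makes completion() return an iterator of incremental
    # chunks instead of a single finished response
    for chunk in completion(model="gpt-3.5-turbo", messages=messages, stream=True):
        # each chunk carries a partial "delta" of the reply (assumed shape)
        print(chunk["choices"][0]["delta"].get("content", ""), end="")
```

The same `stream=True` flag works across the providers listed in the context (OpenAI, Azure, Anthropic, Huggingface), since litellm normalizes their streaming responses to one iterator interface.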