Update README.md

commit a81c77cfdf (parent 9c64558c50)
1 changed file with 0 additions and 7 deletions
@@ -77,13 +77,6 @@ Stable version
 pip install litellm==0.1.424
 ```
 
-
-# LiteLLM Client - debugging & 1-click add new LLMs
-Debugging Dashboard 👉 https://docs.litellm.ai/docs/debugging/hosted_debugging
-
-[image]
-
-
 ## Streaming
 liteLLM supports streaming the model response back, pass `stream=True` to get a streaming iterator in response.
 Streaming is supported for OpenAI, Azure, Anthropic, Huggingface models
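The Streaming lines kept as context in the hunk above describe passing `stream=True` to get an iterator of chunks. A minimal sketch of that usage, assuming litellm's `completion()` call and the OpenAI-style delta chunk format it mirrors (the model name and chunk field layout here are illustrative, not taken from this commit):

```python
from litellm import completion

# Illustrative sketch of the streaming usage described in the README hunk above.
# Model name and chunk structure are assumptions (OpenAI-style deltas).
messages = [{"role": "user", "content": "Hey, how's it going?"}]

# stream=True returns an iterator of incremental chunks instead of one full response
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)

for chunk in response:
    # Each chunk carries an incremental piece of the reply in choices[0]["delta"]
    print(chunk["choices"][0]["delta"])
```

Running a sketch like this assumes the relevant provider key (e.g. `OPENAI_API_KEY`) is already set in the environment.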