forked from phoenix/litellm-mirror
Update README.md
parent 478e6659ab
commit 4c07cf423d
1 changed file with 10 additions and 2 deletions

README.md
@@ -51,11 +51,19 @@ Stable version
 pip install litellm==0.1.1
 ```
+
+Streaming Queries
+liteLLM supports streaming the model response back, pass `stream=True` to get a streaming iterator in response.
+```
+response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
+for chunk in response:
+  print(chunk['choices'][0]['delta'])
+```
 
 # hosted version
 - [Grab time if you want access 👋](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version)
 
-# why did I build this
+# why did we build this
-- **Need for simplicity**: My code started to get extremely complicated managing & translating calls between Azure, OpenAI, Cohere
+- **Need for simplicity**: Our code started to get extremely complicated managing & translating calls between Azure, OpenAI, Cohere
 
 # Support
 Contact us at ishaan@berri.ai / krrish@berri.ai
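The streaming section this commit adds can be sketched end-to-end without an API key. `fake_completion` below is a hypothetical stand-in (not part of liteLLM) that yields OpenAI-style chunks, so the consumption loop mirrors what `completion(..., stream=True)` returns per the README.

```python
def fake_completion(model, messages, stream=False):
    # Hypothetical stand-in for liteLLM's completion(): when stream=True,
    # return an iterator of OpenAI-style chunks instead of one response.
    tokens = ["Hello", ",", " world"]
    if not stream:
        return {"choices": [{"message": {"content": "".join(tokens)}}]}
    return ({"choices": [{"delta": {"content": t}}]} for t in tokens)

messages = [{"role": "user", "content": "say hello"}]
response = fake_completion("gpt-3.5-turbo", messages, stream=True)

# Same iteration pattern as the README snippet: each chunk carries an
# incremental delta, which callers typically concatenate.
text = ""
for chunk in response:
    text += chunk["choices"][0]["delta"].get("content", "")
print(text)
```

Accumulating `delta` fragments this way is the usual pattern for rebuilding the full message from a streamed response.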