mirror of https://github.com/BerriAI/litellm.git, synced 2025-04-24 18:24:20 +00:00
(docs) update migration

This commit is contained in:
parent 3aebc46ebf
commit 5dabcc21c9

1 changed file with 6 additions and 0 deletions
@@ -21,6 +21,12 @@ When we have breaking changes (i.e. going from 1.x.x to 2.x.x), we will document

max_tokens = litellm.get_max_tokens("gpt-3.5-turbo")  # returns an int, not a dict
assert max_tokens == 4097
```
- Streaming - OpenAI chunks now return `None` for empty stream chunks. Here is how to process stream chunks that contain content:

```python
response = litellm.completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for part in response:
    print(part.choices[0].delta.content or "")
```
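The `or ""` guard in the loop above is the key change: empty stream chunks now carry `None` content, so concatenating deltas without the guard would raise a `TypeError`. A minimal sketch of the pattern with mocked chunk objects (no API call; the chunk shapes here are hypothetical stand-ins modeled on the OpenAI delta format, not real litellm objects):

```python
from types import SimpleNamespace

# Mock stream chunks shaped like OpenAI deltas: some carry text,
# empty ones carry None, just as the new behavior describes.
chunks = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ["Hello", None, ", ", None, "world"]
]

# The `or ""` guard turns None content into an empty string before joining.
full_text = "".join(part.choices[0].delta.content or "" for part in chunks)
print(full_text)  # -> Hello, world
```

The same guard works whether you print deltas as they arrive or accumulate them into a full response string.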
**How can we communicate changes better?**

Tell us