(feat) add Predicted Outputs for OpenAI (#6594)
* bump openai to openai==1.54.0
* add 'prediction' param
* testing: fix bedrock deprecated cohere.command-text-v14
* test test_openai_prediction_param.py
* test test_openai_prediction_param_with_caching
* doc Predicted Outputs
* doc Predicted Output
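A minimal sketch of how the new `prediction` parameter might be used through `litellm.completion`, assuming litellm forwards it unchanged to OpenAI's Predicted Outputs API; the model name and prompt below are illustrative, not taken from this commit.

```python
# Sketch: passing a Predicted Outputs "prediction" through litellm's completion call.
# Assumes the dict is forwarded to OpenAI as-is; model and prompt are illustrative only.
import litellm

existing_code = (
    "def greet(name):\n"
    "    return f'Hello, {name}!'\n"
)

response = litellm.completion(
    model="gpt-4o-mini",  # an OpenAI model that supports Predicted Outputs
    messages=[
        {"role": "user", "content": "Rename the function to `welcome`. Reply with code only."},
        {"role": "user", "content": existing_code},
    ],
    # Most of the response is expected to match existing_code, so it is supplied
    # as the prediction to reduce output latency.
    prediction={"type": "content", "content": existing_code},
)

print(response.choices[0].message.content)
```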
Parent: 57b1bb5e06
Commit: c047d51cc8
12 changed files with 362 additions and 13 deletions
@@ -1,6 +1,6 @@
 # LITELLM PROXY DEPENDENCIES #
 anyio==4.4.0 # openai + http req.
-openai==1.52.0 # openai req.
+openai==1.54.0 # openai req.
 fastapi==0.111.0 # server dep
 backoff==2.2.1 # server dep
 pyyaml==6.0.0 # server dep