Commit graph

789 commits

Author SHA1 Message Date
Krish Dholakia
2e434d56e3 Merge pull request #5079 from BerriAI/litellm_add_pydantic_model_support
feat(utils.py): support passing response_format as pydantic model
2024-08-07 14:43:05 -07:00
Krish Dholakia
93d048b1dc Merge branch 'main' into litellm_anthropic_streaming_tool_call_fix 2024-08-07 14:33:30 -07:00
Ishaan Jaff
0b98959e6d gemini test skip internal server error 2024-08-07 13:12:45 -07:00
Krrish Dholakia
a2e792d32e test: update build requirements 2024-08-07 13:09:49 -07:00
Krish Dholakia
3605e873a1 Merge branch 'main' into litellm_add_pydantic_model_support 2024-08-07 13:07:46 -07:00
Krrish Dholakia
ff386f6b60 fix(utils.py): support deepseek tool calling
Fixes https://github.com/BerriAI/litellm/issues/5081
2024-08-07 11:14:05 -07:00
Krrish Dholakia
3646e3e3a4 test(test_completion.py): handle internal server error in test 2024-08-07 10:21:37 -07:00
Krrish Dholakia
2ccb5a48b7 fix(bedrock_httpx.py): handle empty arguments returned during tool calling streaming 2024-08-07 09:54:50 -07:00
Krrish Dholakia
4919cc4d25 fix(anthropic.py): handle scenario where anthropic returns invalid json string for tool call while streaming
Fixes https://github.com/BerriAI/litellm/issues/5063
2024-08-07 09:24:11 -07:00
Ishaan Jaff
3e84014a69 run ci / cd again 2024-08-06 21:35:46 -07:00
Ishaan Jaff
f579aef740 ci/cd run again 2024-08-06 21:28:22 -07:00
Krrish Dholakia
9cf3d5f568 feat(utils.py): support passing response_format as pydantic model
Related issue - https://github.com/BerriAI/litellm/issues/5074
2024-08-06 18:16:07 -07:00
Ishaan Jaff
63e853e161 ci/cd run again 2024-08-05 22:33:49 -07:00
Ishaan Jaff
797a171962 ci/cd run again 2024-08-05 21:21:01 -07:00
Ishaan Jaff
107c468d05 run ci/cd again 2024-08-05 20:04:19 -07:00
Ishaan Jaff
4538eb848f run ci/cd again 2024-08-05 16:52:45 -07:00
Ishaan Jaff
7fae2aa394 ci/cd run again 2024-08-03 18:48:10 -07:00
Krrish Dholakia
4258295a07 feat(utils.py): Add github as a provider
Closes https://github.com/BerriAI/litellm/issues/4922#issuecomment-2266564469
2024-08-03 09:11:22 -07:00
Ishaan Jaff
fc8a87efec ci/cd run again 2024-08-02 11:46:45 -07:00
Krrish Dholakia
cd073d5ad3 test: handle anthropic rate limit error 2024-08-02 08:57:09 -07:00
Ishaan Jaff
7f93fa01e9 ci/cd run again 2024-08-01 19:55:12 -07:00
Ishaan Jaff
80831ce73f ci/cd - anyscale discontinued their API endpoints - skip test 2024-08-01 17:58:48 -07:00
Krrish Dholakia
83f638100e test: handle predibase api failures 2024-07-31 19:39:58 -07:00
Krrish Dholakia
09ee8c6e2d fix(utils.py): return additional kwargs from openai-like response body
Closes https://github.com/BerriAI/litellm/issues/4981
2024-07-31 15:37:03 -07:00
Krrish Dholakia
bd68714f51 fix(utils.py): map cohere timeout error 2024-07-31 15:15:18 -07:00
Krrish Dholakia
6202f9bbb0 fix(http_handler.py): correctly re-raise timeout exception 2024-07-31 14:51:28 -07:00
Ishaan Jaff
7db85fbdb7 fix predibase mock test 2024-07-31 08:16:24 -07:00
Ishaan Jaff
1b6bf48264 ci/cd run again 2024-07-30 22:55:46 -07:00
Krrish Dholakia
24395492aa test: cleanup duplicate tests + add error handling for backend api errors 2024-07-30 21:47:52 -07:00
Krrish Dholakia
14d54f7af5 test(test_completion.py): handle gemini internal server error 2024-07-30 21:02:58 -07:00
Ishaan Jaff
c551e5b47a ci/cd run again 2024-07-30 17:21:47 -07:00
Krrish Dholakia
3cd3491920 test: cleanup testing 2024-07-24 19:47:50 -07:00
Krrish Dholakia
f35af3bf1c test(test_completion.py): update azure extra headers 2024-07-24 18:42:50 -07:00
Krrish Dholakia
77ffee4e2e test(test_completion.py): add basic test to confirm azure ad token flow works as expected 2024-07-24 13:07:25 -07:00
Krrish Dholakia
d9539e518e build(docker-compose.yml): add prometheus scraper to docker compose
persists prometheus data across restarts
2024-07-24 10:09:23 -07:00
Krrish Dholakia
fb0a13c8bb fix(anthropic.py): support openai system message being a list 2024-07-23 21:45:56 -07:00
Krrish Dholakia
f64a3309d1 fix(utils.py): support raw response headers for streaming requests 2024-07-23 11:58:58 -07:00
Ishaan Jaff
f6225623e9 Merge branch 'main' into litellm_return-response_headers 2024-07-20 19:05:56 -07:00
Ishaan Jaff
2513b64ed4 ci/cd run tests again 2024-07-20 17:44:12 -07:00
Ishaan Jaff
5e4d291244 rename to _response_headers 2024-07-20 17:31:16 -07:00
Ishaan Jaff
5e52f50a82 return response headers 2024-07-20 15:26:44 -07:00
Krrish Dholakia
a27454b8e3 fix(openai.py): support completion, streaming, async_streaming 2024-07-20 15:23:42 -07:00
Ishaan Jaff
6039e0b2a7 test - response_headers 2024-07-20 15:08:54 -07:00
Krrish Dholakia
e45956d77e fix(utils.py): fix get_llm_provider to support dynamic params for openai-compatible providers 2024-07-19 19:36:31 -07:00
Ishaan Jaff
1797021d53 ci/cd run again 2024-07-19 14:05:22 -07:00
Ishaan Jaff
2e766a7b1f ci/cd run again 2024-07-19 08:25:56 -07:00
Ishaan Jaff
9440754e48 ci/cd run again 2024-07-17 20:37:10 -07:00
Ishaan Jaff
c16583464a ci/cd run again 2024-07-17 20:25:43 -07:00
Ishaan Jaff
f9592b1c06 ci/cd run again 2024-07-17 19:57:47 -07:00
Ishaan Jaff
a6a9a186ad ci/cd run again 2024-07-17 18:40:35 -07:00