Commit graph

1079 commits

Author SHA1 Message Date
Ishaan Jaff
466accd4f5
Merge pull request #3462 from ffreemt/main
Add return_exceptions to batch_completion (retry)
2024-05-24 09:19:10 -07:00
ffreemt
86d46308bf Make return_exceptions the default behavior in litellm.batch_completion 2024-05-24 11:09:11 +08:00
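
For context, a minimal sketch of how `batch_completion` behaves after this change, assuming the model name and prompts are placeholders: failed sub-requests come back in the result list as exception objects (the default behavior per this commit) instead of aborting the whole batch.

```python
import litellm

# One model, several independent conversations, dispatched in parallel.
responses = litellm.batch_completion(
    model="gpt-3.5-turbo",  # placeholder model
    messages=[
        [{"role": "user", "content": "Hello, how are you?"}],
        [{"role": "user", "content": "This request might fail."}],
    ],
)

for r in responses:
    if isinstance(r, Exception):
        # With return_exceptions as the default, a failed call surfaces here
        # rather than raising and discarding the other results.
        print("request failed:", r)
    else:
        print(r.choices[0].message.content)
```
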
Krrish Dholakia
43353c28b3 feat(databricks.py): add embedding model support 2024-05-23 18:22:03 -07:00
Krrish Dholakia
d2229dcd21 feat(databricks.py): adds databricks support - completion, async, streaming
Closes https://github.com/BerriAI/litellm/issues/2160
2024-05-23 16:29:46 -07:00
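
A hedged sketch of the Databricks provider these two commits add (completion and embeddings). The `databricks/` model prefix and the `DATABRICKS_*` environment variables follow litellm's usual provider conventions and are assumptions here, as are the model identifiers.

```python
import os
import litellm

# Placeholder credentials; real values come from a Databricks workspace.
os.environ["DATABRICKS_API_KEY"] = "dapi-..."
os.environ["DATABRICKS_API_BASE"] = "https://<workspace>.cloud.databricks.com/serving-endpoints"

# Chat completion against a Databricks serving endpoint.
resp = litellm.completion(
    model="databricks/databricks-dbrx-instruct",
    messages=[{"role": "user", "content": "Say hello."}],
)

# Embeddings, per the follow-up embedding-support commit.
emb = litellm.embedding(
    model="databricks/databricks-bge-large-en",
    input=["litellm commit history"],
)
```
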
Krrish Dholakia
f3d29a6b4a feat(anthropic.py): support anthropic 'tool_choice' param
Closes https://github.com/BerriAI/litellm/issues/3752
2024-05-21 17:50:44 -07:00
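
A hedged sketch of the `tool_choice` support referenced above: the caller passes an OpenAI-style `tools` / `tool_choice` pair and litellm translates it for Anthropic. The tool definition and model name are illustrative.

```python
import litellm

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Force the model to call get_weather; litellm maps the OpenAI-style
# tool_choice onto Anthropic's equivalent field.
resp = litellm.completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "get_weather"}},
)
```
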
Ishaan Jaff
2519879e67 add ImageObject 2024-05-20 10:45:37 -07:00
Ishaan Jaff
24951d44a4 feat - working httpx requests vertex ai image gen 2024-05-20 09:51:15 -07:00
Krrish Dholakia
5d24a72b7e fix(bedrock_httpx.py): support mapping for bedrock cohere command r text 2024-05-17 16:13:49 -07:00
Krrish Dholakia
0258351c61 fix(main.py): fix async stream handling during bedrock error 2024-05-16 23:37:59 -07:00
Krrish Dholakia
92c2e2af6a fix(bedrock_httpx.py): add async support for bedrock amazon, meta, mistral models 2024-05-16 22:39:25 -07:00
Krrish Dholakia
0293f7766a fix(bedrock_httpx.py): move bedrock ai21 calls to being async 2024-05-16 22:21:30 -07:00
Krrish Dholakia
180bc46ca4 fix(bedrock_httpx.py): move anthropic bedrock calls to httpx
Fixing https://github.com/BerriAI/litellm/issues/2921
2024-05-16 21:51:55 -07:00
Krrish Dholakia
709373b15c fix(replicate.py): move replicate calls to being completely async
Closes https://github.com/BerriAI/litellm/issues/3128
2024-05-16 17:24:08 -07:00
Ishaan Jaff
97324800ec
Merge pull request #3694 from BerriAI/litellm_allow_setting_anthropic_beta
[Feat] Support Anthropic `tools-2024-05-16` - Set Custom Anthropic Headers
2024-05-16 15:48:26 -07:00
Ishaan Jaff
1fc9bcb184 feat use OpenAI extra_headers param 2024-05-16 14:38:17 -07:00
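
A hedged sketch of the custom-header path from PR #3694: the OpenAI-style `extra_headers` parameter is forwarded to Anthropic, here opting into the `tools-2024-05-16` beta named in the PR title. Model name and header value are illustrative.

```python
import litellm

resp = litellm.completion(
    model="claude-3-sonnet-20240229",
    messages=[{"role": "user", "content": "Hello"}],
    # Forwarded as a raw request header to the Anthropic API.
    extra_headers={"anthropic-beta": "tools-2024-05-16"},
)
```
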
Krrish Dholakia
f43da3597d test: fix test 2024-05-15 08:51:40 -07:00
Krrish Dholakia
1840919ebd fix(main.py): testing fix 2024-05-15 08:23:00 -07:00
Edwin Jose George
81836ebe5d fix: custom_llm_provider needs to be set before setting timeout 2024-05-15 22:36:15 +09:30
Krrish Dholakia
b06f989871 refactor(main.py): trigger new build 2024-05-14 22:46:44 -07:00
Krrish Dholakia
3b5c06747d refactor(main.py): trigger new build 2024-05-14 22:17:40 -07:00
Krrish Dholakia
0262c480be refactor(main.py): trigger new build 2024-05-14 19:52:23 -07:00
Krrish Dholakia
298fd9b25c fix(main.py): ignore model_config param 2024-05-14 19:03:17 -07:00
Krrish Dholakia
724d880a45 test(test_completion.py): handle async watsonx call fail 2024-05-13 18:40:51 -07:00
Krrish Dholakia
d4123951d9 test: handle watsonx rate limit error 2024-05-13 18:27:39 -07:00
Krrish Dholakia
3694b5e7c0 refactor(main.py): trigger new build 2024-05-13 18:12:01 -07:00
Krrish Dholakia
240c9550f0 fix(utils.py): handle api assistant returning 'null' role
Fixes https://github.com/BerriAI/litellm/issues/3621
2024-05-13 16:46:07 -07:00
Krrish Dholakia
b4a8665d11 fix(utils.py): fix custom pricing when litellm model != response obj model name 2024-05-13 15:25:35 -07:00
Krrish Dholakia
61143c8b45 refactor(main.py): trigger new build 2024-05-11 22:53:09 -07:00
Krish Dholakia
1d651c6049
Merge branch 'main' into litellm_bedrock_command_r_support 2024-05-11 21:24:42 -07:00
Krrish Dholakia
64650c0279 feat(bedrock_httpx.py): working bedrock command-r sync+async streaming 2024-05-11 19:39:51 -07:00
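
A hedged sketch of the sync and async streaming paths the bedrock_httpx.py commits describe, using a Cohere Command R model on Bedrock. The Bedrock model ID is an assumption; AWS credentials are taken from the environment.

```python
import asyncio
import litellm

messages = [{"role": "user", "content": "Stream a short haiku."}]

# Sync streaming: iterate over OpenAI-style delta chunks.
for chunk in litellm.completion(
    model="bedrock/cohere.command-r-v1:0", messages=messages, stream=True
):
    print(chunk.choices[0].delta.content or "", end="")

# Async streaming via acompletion.
async def main() -> None:
    stream = await litellm.acompletion(
        model="bedrock/cohere.command-r-v1:0", messages=messages, stream=True
    )
    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="")

asyncio.run(main())
```
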
Krrish Dholakia
59c8c0adff feat(bedrock_httpx.py): working cohere command r async calls 2024-05-11 15:04:38 -07:00
Krrish Dholakia
4a3b084961 feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls 2024-05-11 13:43:08 -07:00
Krish Dholakia
8f6ae9a059
Merge pull request #3369 from mogith-pn/main
Clarifai-LiteLLM: Added Clarifai as LLM Provider.
2024-05-11 09:31:46 -07:00
Krrish Dholakia
cd1a470b9a refactor(main.py): trigger new build 2024-05-10 20:17:39 -07:00
Ishaan Jaff
d3550379b0 feat - triton embeddings 2024-05-10 18:57:06 -07:00
Krrish Dholakia
cdec7a414f test(test_router_fallbacks.py): fix test 2024-05-10 09:58:40 -07:00
Krrish Dholakia
9a31f3d3d9 fix(main.py): support env var 'VERTEX_PROJECT' and 'VERTEX_LOCATION' 2024-05-10 07:57:56 -07:00
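
A minimal sketch of the env-var configuration this commit adds: `VERTEX_PROJECT` and `VERTEX_LOCATION` are picked up when `vertex_project` / `vertex_location` are not passed explicitly. The project, region, and model alias are placeholders.

```python
import os
import litellm

os.environ["VERTEX_PROJECT"] = "my-gcp-project"
os.environ["VERTEX_LOCATION"] = "us-central1"

# No vertex_project / vertex_location kwargs needed; the env vars fill them in.
resp = litellm.completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "Hello from Vertex"}],
)
```
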
Krish Dholakia
a671046b45
Merge pull request #3552 from BerriAI/litellm_predibase_support
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
Krrish Dholakia
425efc60f4 fix(main.py): fix linting error 2024-05-09 18:12:28 -07:00
Ishaan Jaff
5eb12e30cc
Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
[Feat] support `stream_options` on `litellm.text_completion`
2024-05-09 18:05:58 -07:00
Krrish Dholakia
d7189c21fd feat(predibase.py): support async_completion + streaming (sync + async)
finishes up pr
2024-05-09 17:41:27 -07:00
Krrish Dholakia
186c0ec77b feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
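
A heavily hedged sketch of calling the new Predibase provider (sync path; the preceding commit adds async and streaming as well). The environment variable name, `tenant_id` keyword, and model alias are assumptions about the provider wiring, not something the log confirms.

```python
import os
import litellm

os.environ["PREDIBASE_API_KEY"] = "pb_..."  # placeholder; assumed env var name

resp = litellm.completion(
    model="predibase/llama-3-8b-instruct",        # assumed model alias
    messages=[{"role": "user", "content": "Hello"}],
    tenant_id="my-tenant",                        # assumed tenant-routing kwarg
)
```
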
Krrish Dholakia
5c6a382d3b refactor(main.py): trigger new build 2024-05-09 15:41:33 -07:00
Ishaan Jaff
4d5b4a5293 add stream_options to text_completion 2024-05-09 08:35:35 -07:00
Ishaan Jaff
0b1885ca99
Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
[Feat] support `stream_options` param for OpenAI
2024-05-09 08:34:08 -07:00
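
A hedged sketch of the `stream_options` support from PRs #3537 and #3547: OpenAI's `{"include_usage": True}` flag is forwarded for both chat and text completions, so the final streamed chunk carries a usage block. Model names are illustrative.

```python
import litellm

# Chat completions (PR #3537).
for chunk in litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Count to three."}],
    stream=True,
    stream_options={"include_usage": True},
):
    if getattr(chunk, "usage", None):
        print("usage:", chunk.usage)

# Text completions (PR #3547).
for chunk in litellm.text_completion(
    model="gpt-3.5-turbo-instruct",
    prompt="Say hi",
    stream=True,
    stream_options={"include_usage": True},
):
    pass
```
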
Ishaan Jaff
dfd6361310 fix completion vs acompletion params 2024-05-09 07:59:37 -07:00
Krish Dholakia
8015bc1c47
Revert "Add support for async streaming to watsonx provider " 2024-05-09 07:44:15 -07:00
Krish Dholakia
8ad979cdfe
Merge branch 'main' into litellm_region_based_routing 2024-05-08 22:19:51 -07:00
Krish Dholakia
3f13251241
Merge pull request #3479 from simonsanvil/feature/watsonx-integration
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Krrish Dholakia
3d18897d69 feat(router.py): enable filtering model group by 'allowed_model_region' 2024-05-08 22:10:17 -07:00