Commit graph

1070 commits

Author SHA1 Message Date
Krrish Dholakia
49ab1a1d3f fix(bedrock_httpx.py): working async bedrock command r calls 2024-05-11 16:45:20 -07:00
Krrish Dholakia
59c8c0adff feat(bedrock_httpx.py): working cohere command r async calls 2024-05-11 15:04:38 -07:00
Krrish Dholakia
4a3b084961 feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls 2024-05-11 13:43:08 -07:00
Krish Dholakia
86d0c0ae4e
Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
feat(router.py): allow setting model_region in litellm_params
2024-05-11 11:36:22 -07:00
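A minimal sketch of what setting `model_region` in `litellm_params` might look like, based only on the PR title above; the model group name and region value are placeholders, not taken from the PR diff:

```python
# Hedged sketch: tagging a Router deployment with a region via litellm_params.
# "model_region" comes from the PR title; the rest is illustrative.
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "claude-3",  # hypothetical model group name
            "litellm_params": {
                "model": "bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
                "aws_region_name": "eu-west-1",
                "model_region": "eu",  # explicit region tag added by this PR
            },
        }
    ],
)
```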
Krrish Dholakia
6714854bb7 feat(router.py): support region routing for bedrock, vertex ai, watsonx 2024-05-11 11:04:00 -07:00
Krish Dholakia
d33e49411d
Merge pull request #3561 from simonsanvil/feature/watsonx-integration
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00
Krish Dholakia
8f6ae9a059
Merge pull request #3369 from mogith-pn/main
Clarifai-LiteLLM: Added Clarifai as LLM Provider.

2024-05-11 09:31:46 -07:00
Krish Dholakia
40063798bd
Merge pull request #3507 from Manouchehri/oidc-3505-part-1
Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI)
2024-05-11 09:25:17 -07:00
Krrish Dholakia
c12af219af fix: fix linting 2024-05-10 20:23:33 -07:00
Krrish Dholakia
94f3d361b0 fix(vertex_ai.py): fix list tool call responses
Closes https://github.com/BerriAI/litellm/issues/3147
2024-05-10 20:05:58 -07:00
Ishaan Jaff
b09075da53
Merge pull request #3577 from BerriAI/litellm_add_triton_server
[Feat] Add Triton Embeddings to LiteLLM
2024-05-10 19:20:23 -07:00
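A hedged usage sketch for the Triton embeddings feature merged above; the `triton/` model prefix and the Triton inference-server endpoint path are assumptions for illustration, not confirmed API:

```python
# Hypothetical call shape for Triton embeddings via litellm.
import litellm

response = litellm.embedding(
    model="triton/my-embedding-model",  # assumed provider prefix + placeholder model
    input=["hello from litellm"],
    api_base="http://localhost:8000/v2/models/my-embedding-model/infer",  # assumed Triton endpoint
)
print(response.data[0]["embedding"])
```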
Ishaan Jaff
d3550379b0 feat - triton embeddings 2024-05-10 18:57:06 -07:00
Krrish Dholakia
d4d175030f docs(huggingface.md): add text-classification to huggingface docs 2024-05-10 14:39:14 -07:00
Krrish Dholakia
c17f221b89 test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi 2024-05-10 14:07:01 -07:00
Rajan Paneru
65b07bcb8c Preserving the Pydantic Message Object
The following statement replaces the Pydantic Message object and initializes it with a dict:
model_response["choices"][0]["message"] = response_json["message"]

We need to make sure the message is always a litellm.Message object.

As a fix, based on the code of the ollama.py file, I am updating just the content instead of the entire object, for both the sync and async functions.
2024-05-10 22:12:32 +09:30
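A simplified sketch of the fix described in this commit message (not the actual diff): keep the existing litellm.Message object and copy only the content over, instead of clobbering the object with the raw provider dict:

```python
# Before: the Pydantic Message object is replaced by a plain dict
# model_response["choices"][0]["message"] = response_json["message"]

# After: only the content is updated, so the value stays a litellm.Message
model_response["choices"][0]["message"].content = response_json["message"]["content"]
```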
Simon Sanchez Viloria
e1372de9ee Merge branch 'main' into feature/watsonx-integration 2024-05-10 12:09:09 +02:00
Simon Sanchez Viloria
170fd11c82 (fix) watsonx.py: Fixed linting errors and made sure stream chunks always return usage 2024-05-10 11:53:33 +02:00
Krish Dholakia
a671046b45
Merge pull request #3552 from BerriAI/litellm_predibase_support
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
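A hedged sketch of what calling the new Predibase provider might look like; the `predibase/` prefix, the model id, and the `tenant_id` parameter are assumptions for illustration:

```python
# Hypothetical call shape for the Predibase provider.
import litellm

response = litellm.completion(
    model="predibase/llama-3-8b-instruct",  # assumed "predibase/" prefix + placeholder model
    messages=[{"role": "user", "content": "Hello!"}],
    api_key="pb_...",        # Predibase API key (placeholder)
    tenant_id="my-tenant",   # assumed Predibase-specific setting
)
print(response.choices[0].message.content)
```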
Krrish Dholakia
714370956f fix(predibase.py): fix async streaming 2024-05-09 22:18:16 -07:00
Krrish Dholakia
76d4290591 fix(predibase.py): fix event loop closed error 2024-05-09 19:07:19 -07:00
Krrish Dholakia
491e177348 fix(predibase.py): fix async completion call 2024-05-09 18:44:19 -07:00
Ishaan Jaff
5eb12e30cc
Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
[Feat] support `stream_options` on `litellm.text_completion`
2024-05-09 18:05:58 -07:00
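A minimal sketch of `stream_options` on `litellm.text_completion`, assuming it mirrors OpenAI's `{"include_usage": True}` semantics so the final streamed chunk carries token usage:

```python
import litellm

stream = litellm.text_completion(
    model="gpt-3.5-turbo-instruct",
    prompt="Say hello",
    stream=True,
    stream_options={"include_usage": True},  # the param added by this PR
)
for chunk in stream:
    print(chunk)
```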
Krrish Dholakia
9083d8e490 fix: fix linting errors 2024-05-09 17:55:27 -07:00
Krrish Dholakia
d7189c21fd feat(predibase.py): support async_completion + streaming (sync + async)
Finishes up the PR.
2024-05-09 17:41:27 -07:00
Krrish Dholakia
186c0ec77b feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
frob
c44970c813
Merge branch 'BerriAI:main' into ollama-image-handling 2024-05-09 20:25:30 +02:00
Ishaan Jaff
66053f14ae stream_options for text-completion openai 2024-05-09 08:37:40 -07:00
Ishaan Jaff
0b1885ca99
Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
[Feat] support `stream_options` param for OpenAI
2024-05-09 08:34:08 -07:00
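For chat completions, a hedged sketch under the same assumption (OpenAI semantics: with `include_usage`, the last chunk reports token usage); the attribute access on the chunk is an assumption:

```python
import litellm

stream = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi"}],
    stream=True,
    stream_options={"include_usage": True},
)
for chunk in stream:
    usage = getattr(chunk, "usage", None)  # assumed: final chunk carries usage
    if usage is not None:
        print(usage)
```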
Krish Dholakia
8015bc1c47
Revert "Add support for async streaming to watsonx provider " 2024-05-09 07:44:15 -07:00
Krish Dholakia
8ad979cdfe
Merge branch 'main' into litellm_region_based_routing 2024-05-08 22:19:51 -07:00
Krish Dholakia
3f13251241
Merge pull request #3479 from simonsanvil/feature/watsonx-integration
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Krrish Dholakia
3d18897d69 feat(router.py): enable filtering model group by 'allowed_model_region' 2024-05-08 22:10:17 -07:00
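A hedged sketch of what filtering a model group by region might look like; the `allowed_model_region` name is from the commit title, while the call shape and the `enable_pre_call_checks` prerequisite are assumptions:

```python
from litellm import Router

router = Router(
    model_list=[{
        "model_name": "claude-3",
        "litellm_params": {
            "model": "bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
            "model_region": "eu",  # region tag on the deployment
        },
    }],
    enable_pre_call_checks=True,  # assumed prerequisite for region filtering
)

response = router.completion(
    model="claude-3",
    messages=[{"role": "user", "content": "Hi"}],
    allowed_model_region="eu",  # only route to EU-tagged deployments (assumed param)
)
```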
Ishaan Jaff
1042051602 support stream_options for chat completion models 2024-05-08 21:52:25 -07:00
Ishaan Jaff
6d955ef457
Merge branch 'main' into ui_show_spend_end_user 2024-05-08 18:29:25 -07:00
Ishaan Jaff
faab704d28 update global/spend/end_users 2024-05-08 17:03:09 -07:00
Krrish Dholakia
6575143460 feat(proxy_server.py): return litellm version in response headers 2024-05-08 16:00:08 -07:00
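A quick way to check the new version header against a running proxy; the header name `x-litellm-version` and the endpoint are assumptions based on the commit message:

```python
import httpx

resp = httpx.get("http://localhost:4000/health/readiness")  # assumed proxy endpoint
print(resp.headers.get("x-litellm-version"))  # assumed header name
```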
frob
b93c00abec
Merge branch 'BerriAI:main' into ollama-image-handling 2024-05-09 00:14:29 +02:00
Krish Dholakia
303e0c6226
Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role" 2024-05-07 21:42:18 -07:00
Krish Dholakia
a325bf2fb8
Merge pull request #3478 from nkvch/Issue-#3474-anthropic-roles-alternation-issue
* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role
2024-05-07 21:24:47 -07:00
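An illustrative sketch (not the factory.py diff) of the behavior named in the PR title: consecutive messages from one role, separated only by an empty message of the other role, get merged so roles strictly alternate:

```python
def merge_consecutive_messages(messages):
    merged = []
    for msg in messages:
        if not msg["content"]:
            continue  # drop empty separator messages
        if merged and merged[-1]["role"] == msg["role"]:
            merged[-1]["content"] += "\n" + msg["content"]  # merge same-role run
        else:
            merged.append(dict(msg))
    return merged

# Example: user, empty assistant, user -> one merged user message
print(merge_consecutive_messages([
    {"role": "user", "content": "part one"},
    {"role": "assistant", "content": ""},
    {"role": "user", "content": "part two"},
]))
```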
Ishaan Jaff
2725a55e7a
Merge pull request #3470 from mbektas/fix-ollama-embeddings
support sync ollama embeddings
2024-05-07 19:21:37 -07:00
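A hedged usage sketch for sync Ollama embeddings; the model name is a placeholder, and the `api_base` is the default local Ollama endpoint:

```python
import litellm

response = litellm.embedding(
    model="ollama/nomic-embed-text",    # assumed "ollama/" prefix + placeholder model
    input=["hello world"],
    api_base="http://localhost:11434",  # default local Ollama server
)
```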
David Manouchehri
cb49fb004d
fix(azure.py): Correct invalid .get to a .post for OIDC 2024-05-07 20:01:46 +00:00
David Manouchehri
e268354acc
feat(azure.py): Support OIDC auth 2024-05-07 20:01:33 +00:00
David Manouchehri
3ee0328b04
feat(bedrock.py): Support using OIDC tokens. 2024-05-07 15:46:54 +00:00
nkvch
06d1aec49e * fix(factory.py): fix conditional statement for merging messages with previous ones 2024-05-07 13:42:42 +02:00
nkvch
7d7b59ff78 * feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role 2024-05-07 12:51:30 +02:00
Krrish Dholakia
c6d830d377 fix(anthropic.py): remove raise error on 'empty content'
Fixes https://github.com/BerriAI/litellm/issues/3453
2024-05-06 11:42:09 -07:00
frob
7a1a3f6411
Merge branch 'BerriAI:main' into ollama-image-handling 2024-05-06 18:06:45 +02:00
Simon Sanchez Viloria
6181d1eaad Merge branch 'main' into feature/watsonx-integration 2024-05-06 17:27:14 +02:00
Simon Sanchez Viloria
83a274b54b (feat) support for async stream to watsonx provider 2024-05-06 17:08:40 +02:00
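A minimal sketch of async streaming through the watsonx provider, assuming the `watsonx/` model prefix; the model id is a placeholder:

```python
import asyncio
import litellm

async def main():
    stream = await litellm.acompletion(
        model="watsonx/ibm/granite-13b-chat-v2",  # placeholder watsonx model id
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )
    async for chunk in stream:
        print(chunk)

asyncio.run(main())
```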
Simon Sanchez Viloria
62b3f25398 (fix) fix ibm_granite_pt and use hf_chat_template for llama-3 instruct prompts 2024-05-06 17:08:40 +02:00