Commit graph

649 commits

Author SHA1 Message Date
Krrish Dholakia
29449aa5c1 fix(utils.py): fix watsonx exception mapping 2024-05-13 18:13:13 -07:00
Krrish Dholakia
240c9550f0 fix(utils.py): handle api assistant returning 'null' role
Fixes https://github.com/BerriAI/litellm/issues/3621
2024-05-13 16:46:07 -07:00
Krrish Dholakia
20456968e9 fix(openai.py): create MistralConfig with response_format mapping for mistral api 2024-05-13 13:29:58 -07:00
Krish Dholakia
1d651c6049 Merge branch 'main' into litellm_bedrock_command_r_support 2024-05-11 21:24:42 -07:00
Ishaan Jaff
2b3414c667 ci/cd run again 2024-05-11 20:34:55 -07:00
Krrish Dholakia
49ab1a1d3f fix(bedrock_httpx.py): working async bedrock command r calls 2024-05-11 16:45:20 -07:00
Krrish Dholakia
59c8c0adff feat(bedrock_httpx.py): working cohere command r async calls 2024-05-11 15:04:38 -07:00
Krrish Dholakia
4a3b084961 feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls 2024-05-11 13:43:08 -07:00
Krish Dholakia
d33e49411d Merge pull request #3561 from simonsanvil/feature/watsonx-integration
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00
Ishaan Jaff
2c4604d90f (ci/cd) run again 2024-05-10 19:22:13 -07:00
Krish Dholakia
1aa567f3b5 Merge pull request #3571 from BerriAI/litellm_hf_classifier_support
Huggingface classifier support
2024-05-10 17:54:27 -07:00
Ishaan Jaff
e3848abdfe Merge pull request #3569 from BerriAI/litellm_fix_bug_upsert_deployments
[Fix] Upsert deployment bug
2024-05-10 16:53:59 -07:00
Ishaan Jaff
1a8e853817 (ci/cd) run again 2024-05-10 16:19:03 -07:00
Krrish Dholakia
6a400a6200 test: fix test 2024-05-10 15:49:20 -07:00
Krrish Dholakia
d4d175030f docs(huggingface.md): add text-classification to huggingface docs 2024-05-10 14:39:14 -07:00
Krrish Dholakia
c17f221b89 test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi 2024-05-10 14:07:01 -07:00
Krrish Dholakia
781d5888c3 docs(predibase.md): add support for predibase to docs 2024-05-10 10:58:35 -07:00
Simon Sanchez Viloria
e1372de9ee Merge branch 'main' into feature/watsonx-integration 2024-05-10 12:09:09 +02:00
Simon Sanchez Viloria
d3d82827ed (test) Add tests for WatsonX completion/acompletion streaming 2024-05-10 11:55:58 +02:00
Krrish Dholakia
d7189c21fd feat(predibase.py): support async_completion + streaming (sync + async)
finishes up pr
2024-05-09 17:41:27 -07:00
Krrish Dholakia
186c0ec77b feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
Krish Dholakia
303e0c6226 Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role" 2024-05-07 21:42:18 -07:00
Krish Dholakia
a325bf2fb8 Merge pull request #3478 from nkvch/Issue-#3474-anthropic-roles-alternation-issue
* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role
2024-05-07 21:24:47 -07:00
Ishaan Jaff
21d3407b95 fix replicate test 2024-05-07 19:48:46 -07:00
Paul Gauthier
82a4c68e60 Added deepseek completion test 2024-05-07 11:58:05 -07:00
nkvch
389530efb4 * chore(.gitignore): add 'venv' to the list of ignored files/directories
* fix(test_completion.py): fix import order and remove unused imports
* feat(test_completion.py): add test for empty assistant message in completion_claude_3_empty_message()
2024-05-07 12:51:30 +02:00
Krrish Dholakia
863f9c60a2 refactor: trigger new build 2024-05-06 11:46:30 -07:00
Krrish Dholakia
b014a72f7a test(test_openai_endpoints.py): change key 2024-05-06 11:19:47 -07:00
Ishaan Jaff
4bd3967a1a (ci/cd) run again 2024-05-06 11:04:43 -07:00
Krrish Dholakia
4b5cf26c1b fix(utils.py): handle gemini chunk no parts error
Fixes https://github.com/BerriAI/litellm/issues/3468
2024-05-06 10:59:53 -07:00
Krrish Dholakia
b5f3f198f2 fix(utils.py): anthropic error handling 2024-05-06 07:25:12 -07:00
Krrish Dholakia
d83f0b02da test: fix local tests 2024-05-06 07:14:33 -07:00
Jack Collins
07b13ff7c5 Remove unused ModelResponse import 2024-05-06 00:16:58 -07:00
Jack Collins
51c02fdadf Add tests for ollama + ollama chat tool calls +/- stream 2024-05-06 00:13:42 -07:00
Krrish Dholakia
8d49b3a84c fix(factory.py): support openai 'functions' messages 2024-05-04 12:33:39 -07:00
Krrish Dholakia
d9d5149aa1 fix(factory.py): support mapping openai 'tool' message to anthropic format 2024-05-04 10:14:52 -07:00
Krrish Dholakia
33472bfd2b fix(factory.py): support 'function' openai message role for anthropic
Fixes https://github.com/BerriAI/litellm/issues/3446
2024-05-04 10:03:30 -07:00
Ishaan Jaff
3d9287602e ci/cd run again 2024-05-01 21:13:14 -07:00
alisalim17
0aa8b94ff5 test: completion with Cohere command-r-plus model 2024-04-29 18:38:12 +04:00
Krrish Dholakia
1f6c342e94 test: fix test 2024-04-28 09:45:01 -07:00
Krish Dholakia
1841b74f49 Merge branch 'main' into litellm_common_auth_params 2024-04-28 08:38:06 -07:00
Krrish Dholakia
2c67791663 test(test_completion.py): modify acompletion test to call pre-deployed watsonx endpoint 2024-04-27 11:19:00 -07:00
Krrish Dholakia
48f19cf839 feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx 2024-04-27 11:06:18 -07:00
Krish Dholakia
2a006c3d39 Revert "Fix Anthropic Messages Prompt Template function to add a third condition: list of text-content dictionaries" 2024-04-27 08:57:18 -07:00
Krish Dholakia
2d976cfabc Merge pull request #3270 from simonsanvil/feature/watsonx-integration
(feat) add IBM watsonx.ai as an llm provider
2024-04-27 05:48:34 -07:00
Emir Ayar
2ecbf6663a Add test for completion with text content dictionaries 2024-04-27 12:27:12 +02:00
Krish Dholakia
69280177a3 Merge pull request #3308 from BerriAI/litellm_fix_streaming_n
fix(utils.py): fix the response object returned when n>1 for stream=true
2024-04-25 18:36:54 -07:00
Krrish Dholakia
9f5ba67f5d fix(utils.py): return logprobs as an object not dict 2024-04-25 17:55:18 -07:00
Krrish Dholakia
caf1e28ba3 test(test_completion.py): fix test 2024-04-25 14:07:07 -07:00
Krrish Dholakia
4f46b4c397 fix(factory.py): add replicate meta llama prompt templating support 2024-04-25 08:25:00 -07:00