Commit graph

578 commits

Author  SHA1  Message  Date
Krish Dholakia  303e0c6226  Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role"  2024-05-07 21:42:18 -07:00
Krish Dholakia  a325bf2fb8  Merge pull request #3478 from nkvch/Issue-#3474-anthropic-roles-alternation-issue  2024-05-07 21:24:47 -07:00
    * feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role
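The role-alternation fix above addresses a common provider constraint: some chat APIs (Anthropic's among them) reject consecutive messages with the same role. A minimal sketch of the idea, assuming string contents and a hypothetical helper name (not litellm's actual factory.py code), is to drop the empty separator messages and fold same-role neighbors together:

```python
def merge_consecutive_messages(messages):
    """Collapse consecutive messages sharing a role, skipping empty
    messages inserted only to break up same-role runs, so the result
    alternates roles. Illustrative sketch; the real implementation
    in factory.py may differ."""
    merged = []
    for msg in messages:
        content = msg.get("content") or ""
        if isinstance(content, str) and content.strip() == "":
            # An empty message of the other role carries no information;
            # removing it lets the neighbors on either side merge.
            continue
        if merged and merged[-1]["role"] == msg["role"]:
            merged[-1]["content"] += "\n" + content
        else:
            merged.append({"role": msg["role"], "content": content})
    return merged


msgs = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": ""},
    {"role": "user", "content": "Are you there?"},
]
print(merge_consecutive_messages(msgs))
# The two user messages merge once the empty assistant message is dropped.
```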
Ishaan Jaff  21d3407b95  fix replicate test  2024-05-07 19:48:46 -07:00
Paul Gauthier  82a4c68e60  Added deepseek completion test  2024-05-07 11:58:05 -07:00
nkvch  389530efb4  * chore(.gitignore): add 'venv' to the list of ignored files/directories  2024-05-07 12:51:30 +02:00
    * fix(test_completion.py): fix import order and remove unused imports
    * feat(test_completion.py): add test for empty assistant message in completion_claude_3_empty_message()
Krrish Dholakia  863f9c60a2  refactor: trigger new build  2024-05-06 11:46:30 -07:00
Krrish Dholakia  b014a72f7a  test(test_openai_endpoints.py): change key  2024-05-06 11:19:47 -07:00
Ishaan Jaff  4bd3967a1a  (ci/cd) run again  2024-05-06 11:04:43 -07:00
Krrish Dholakia  4b5cf26c1b  fix(utils.py): handle gemini chunk no parts error  2024-05-06 10:59:53 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/3468
Krrish Dholakia  b5f3f198f2  fix(utils.py): anthropic error handling  2024-05-06 07:25:12 -07:00
Krrish Dholakia  d83f0b02da  test: fix local tests  2024-05-06 07:14:33 -07:00
Jack Collins  07b13ff7c5  Remove unused ModelResponse import  2024-05-06 00:16:58 -07:00
Jack Collins  51c02fdadf  Add tests for ollama + ollama chat tool calls +/- stream  2024-05-06 00:13:42 -07:00
Krrish Dholakia  8d49b3a84c  fix(factory.py): support openai 'functions' messages  2024-05-04 12:33:39 -07:00
Krrish Dholakia  d9d5149aa1  fix(factory.py): support mapping openai 'tool' message to anthropic format  2024-05-04 10:14:52 -07:00
Krrish Dholakia  33472bfd2b  fix(factory.py): support 'function' openai message role for anthropic  2024-05-04 10:03:30 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/3446
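The three factory.py commits above concern translating OpenAI-style 'tool' and 'function' messages into Anthropic's shape. A rough sketch of the mapping, assuming a hypothetical helper name and the tool_result content-block form documented for Anthropic's Messages API (not litellm's actual code):

```python
def openai_tool_result_to_anthropic(msg):
    """Map an OpenAI-style 'tool' (or legacy 'function') result message
    to what Anthropic's Messages API expects: a 'user' message whose
    content is a list containing a 'tool_result' block. Hypothetical
    sketch; factory.py handles many more cases."""
    return {
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                # OpenAI's tool_call_id corresponds to Anthropic's tool_use_id.
                "tool_use_id": msg.get("tool_call_id", ""),
                "content": msg.get("content", ""),
            }
        ],
    }


openai_msg = {"role": "tool", "tool_call_id": "call_123", "content": "72F"}
print(openai_tool_result_to_anthropic(openai_msg))
```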
Ishaan Jaff  3d9287602e  ci/cd run again  2024-05-01 21:13:14 -07:00
alisalim17  0aa8b94ff5  test: completion with Cohere command-r-plus model  2024-04-29 18:38:12 +04:00
Krrish Dholakia  1f6c342e94  test: fix test  2024-04-28 09:45:01 -07:00
Krish Dholakia  1841b74f49  Merge branch 'main' into litellm_common_auth_params  2024-04-28 08:38:06 -07:00
Krrish Dholakia  2c67791663  test(test_completion.py): modify acompletion test to call pre-deployed watsonx endpoint  2024-04-27 11:19:00 -07:00
Krrish Dholakia  48f19cf839  feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx  2024-04-27 11:06:18 -07:00
Krish Dholakia  2a006c3d39  Revert "Fix Anthropic Messages Prompt Template function to add a third condition: list of text-content dictionaries"  2024-04-27 08:57:18 -07:00
Krish Dholakia  2d976cfabc  Merge pull request #3270 from simonsanvil/feature/watsonx-integration  2024-04-27 05:48:34 -07:00
    (feat) add IBM watsonx.ai as an llm provider
Emir Ayar  2ecbf6663a  Add test for completion with text content dictionaries  2024-04-27 12:27:12 +02:00
Krish Dholakia  69280177a3  Merge pull request #3308 from BerriAI/litellm_fix_streaming_n  2024-04-25 18:36:54 -07:00
    fix(utils.py): fix the response object returned when n>1 for stream=true
Krrish Dholakia  9f5ba67f5d  fix(utils.py): return logprobs as an object not dict  2024-04-25 17:55:18 -07:00
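The logprobs commit above is the classic dict-versus-object mismatch: callers expect attribute access (resp.logprobs.tokens) on a typed response, not dict lookups. One generic way to sketch the conversion, using the standard library's SimpleNamespace rather than litellm's own typed response classes:

```python
from types import SimpleNamespace


def to_object(value):
    """Recursively wrap dicts in SimpleNamespace so nested fields are
    reachable by attribute access. Illustrative only; litellm returns
    its own response object types, not SimpleNamespace."""
    if isinstance(value, dict):
        return SimpleNamespace(**{k: to_object(v) for k, v in value.items()})
    if isinstance(value, list):
        return [to_object(v) for v in value]
    return value


lp = to_object({"tokens": ["Hello"], "token_logprobs": [-0.25]})
print(lp.tokens, lp.token_logprobs)
```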
Krrish Dholakia  caf1e28ba3  test(test_completion.py): fix test  2024-04-25 14:07:07 -07:00
Krrish Dholakia  4f46b4c397  fix(factory.py): add replicate meta llama prompt templating support  2024-04-25 08:25:00 -07:00
Ishaan Jaff  74817c560e  (ci/cd) run again  2024-04-24 23:23:14 -07:00
Ishaan Jaff  13e0ac64ef  (fix) updating router settings  2024-04-24 23:09:25 -07:00
Ishaan Jaff  242830108c  (ci/cd) run again  2024-04-24 21:09:49 -07:00
Simon Sanchez Viloria  9fc30e8b31  (test) Added completion and embedding tests for watsonx provider  2024-04-24 12:52:29 +02:00
Krish Dholakia  8d2e411df6  Merge pull request #3124 from elisalimli/bugfix/add-missing-tool-calls-mistral-messages  2024-04-23 17:25:12 -07:00
    Add missing tool_calls and name to messages
alisalim17  319e006d13  test: add test for function calling with mistral large latest to test_completion.py  2024-04-21 11:27:21 +04:00
Ishaan Jaff  81e4d59357  test - gemini-1.5-pro-latest  2024-04-19 21:22:19 -07:00
Ishaan Jaff  cb053398eb  (ci/cd) run again  2024-04-18 21:04:07 -07:00
Ishaan Jaff  8c830e031d  (ci/cd) run again  2024-04-18 20:35:21 -07:00
Krrish Dholakia  14eb8c374b  test(test_completion.py): skip local test  2024-04-17 19:14:41 -07:00
Krrish Dholakia  18e3cf8bff  fix(utils.py): support azure mistral function calling  2024-04-17 19:10:26 -07:00
Ishaan Jaff  409bd5b4ab  ci/cd run again  2024-04-17 08:01:39 -07:00
Ishaan Jaff  70f1dc2bb9  (ci/cd) run again  2024-04-16 21:44:11 -07:00
Ishaan Jaff  5393930701  fix function calling prompt - ask llm to respond in fahrenheit  2024-04-16 21:09:53 -07:00
Krrish Dholakia  26286a54b8  fix(anthropic_text.py): add support for async text completion calls  2024-04-15 08:15:00 -07:00
Ishaan Jaff  5856ec03c6  (ci/cd) run again  2024-04-12 20:48:26 -07:00
Krrish Dholakia  a311788f0d  test(test_completion.py): handle api instability  2024-04-09 21:58:48 -07:00
Krrish Dholakia  a6b004f10b  test(test_completion.py): change model  2024-04-09 21:38:17 -07:00
Krrish Dholakia  855e7ed9d2  fix(main.py): handle translating text completion openai to chat completion for async requests  2024-04-09 16:47:49 -07:00
    also adds testing for this, to prevent future regressions
Ishaan Jaff  3d298fc549  (test) completion  2024-04-05 21:03:04 -07:00
Ishaan Jaff  7fc416b636  (ci/cd) run again  2024-04-05 17:26:02 -07:00