Ishaan Jaff
d208dedb35
(ci/cd) run again
2024-05-15 17:39:21 -07:00
Ishaan Jaff
240b183d7a
ci/cd run again
2024-05-15 17:31:14 -07:00
Krrish Dholakia
1840919ebd
fix(main.py): testing fix
2024-05-15 08:23:00 -07:00
Krrish Dholakia
8117af664c
fix(huggingface_restapi.py): fix task extraction from model name
2024-05-15 07:28:19 -07:00
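
The task-extraction fix above deals with Hugging Face model strings that carry an optional task prefix, e.g. text-classification/<model_id>. A minimal sketch of that parsing, assuming a fixed allow-list of tasks and a text-generation-inference default (names here are illustrative, not the actual huggingface_restapi.py code):

# Hypothetical sketch: split an optional "<task>/" prefix off a Hugging Face model string.
SUPPORTED_HF_TASKS = {
    "text-generation-inference",
    "conversational",
    "text-classification",
    "text-generation",
}

def get_task_and_model(model: str) -> tuple[str, str]:
    """Return (task, model_id); fall back to text-generation-inference when no prefix is given."""
    prefix, _, rest = model.partition("/")
    if prefix in SUPPORTED_HF_TASKS and rest:
        return prefix, rest
    return "text-generation-inference", model

print(get_task_and_model("text-classification/shahrukhx01/question-vs-statement-classifier"))
# -> ('text-classification', 'shahrukhx01/question-vs-statement-classifier')
print(get_task_and_model("meta-llama/Meta-Llama-3-8B-Instruct"))
# -> ('text-generation-inference', 'meta-llama/Meta-Llama-3-8B-Instruct')
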
Ishaan Jaff
0bac40b0f2
ci/cd run again
2024-05-14 21:53:14 -07:00
Ishaan Jaff
6290de36df
(ci/cd) run again
2024-05-14 21:39:09 -07:00
Ishaan Jaff
faa58c7938
(ci/cd) run again
2024-05-14 20:45:07 -07:00
Ishaan Jaff
6d1ae5b9c4
(ci/cd) run again
2024-05-14 20:18:12 -07:00
Ishaan Jaff
ffc637969b
(ci/cd) run again
2024-05-13 21:07:12 -07:00
Ishaan Jaff
5de31e9318
(ci/cd) run again
2024-05-13 20:54:50 -07:00
Ishaan Jaff
9bde3ccd1d
(ci/cd) fixes
2024-05-13 20:49:02 -07:00
Krrish Dholakia
071a70c5fc
test: fix watsonx api error
2024-05-13 19:01:19 -07:00
Krrish Dholakia
724d880a45
test(test_completion.py): handle async watsonx call fail
2024-05-13 18:40:51 -07:00
Krrish Dholakia
d4123951d9
test: handle watsonx rate limit error
2024-05-13 18:27:39 -07:00
Krrish Dholakia
29449aa5c1
fix(utils.py): fix watsonx exception mapping
2024-05-13 18:13:13 -07:00
Krrish Dholakia
240c9550f0
fix(utils.py): handle api assistant returning 'null' role
Fixes https://github.com/BerriAI/litellm/issues/3621
2024-05-13 16:46:07 -07:00
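
The 'null' role fix above guards against providers that omit or null out the role on assistant messages. A rough sketch of the idea (not the actual utils.py change):

def normalize_message(raw: dict) -> dict:
    # Treat a missing or null role as "assistant" instead of failing downstream.
    return {"role": raw.get("role") or "assistant", "content": raw.get("content") or ""}

print(normalize_message({"role": None, "content": "Hello!"}))
# -> {'role': 'assistant', 'content': 'Hello!'}
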
Krrish Dholakia
20456968e9
fix(openai.py): create MistralConfig with response_format mapping for mistral api
2024-05-13 13:29:58 -07:00
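
From the caller's side, the MistralConfig change lets response_format pass through to the Mistral API. A usage sketch, assuming MISTRAL_API_KEY is set in the environment (the model name is illustrative):

import litellm

response = litellm.completion(
    model="mistral/mistral-large-latest",
    messages=[{"role": "user", "content": "Reply with a JSON object containing a 'greeting' key."}],
    response_format={"type": "json_object"},  # forwarded to Mistral by the provider config
)
print(response.choices[0].message.content)
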
Krish Dholakia
1d651c6049
Merge branch 'main' into litellm_bedrock_command_r_support
2024-05-11 21:24:42 -07:00
Ishaan Jaff
2b3414c667
ci/cd run again
2024-05-11 20:34:55 -07:00
Krrish Dholakia
49ab1a1d3f
fix(bedrock_httpx.py): working async bedrock command r calls
2024-05-11 16:45:20 -07:00
Krrish Dholakia
59c8c0adff
feat(bedrock_httpx.py): working cohere command r async calls
2024-05-11 15:04:38 -07:00
Krrish Dholakia
4a3b084961
feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls
2024-05-11 13:43:08 -07:00
Krish Dholakia
d33e49411d
Merge pull request #3561 from simonsanvil/feature/watsonx-integration
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00
Ishaan Jaff
2c4604d90f
(ci/cd) run again
2024-05-10 19:22:13 -07:00
Krish Dholakia
1aa567f3b5
Merge pull request #3571 from BerriAI/litellm_hf_classifier_support
Huggingface classifier support
2024-05-10 17:54:27 -07:00
Ishaan Jaff
e3848abdfe
Merge pull request #3569 from BerriAI/litellm_fix_bug_upsert_deployments
[Fix] Upsert deployment bug
2024-05-10 16:53:59 -07:00
Ishaan Jaff
1a8e853817
(ci/cd) run again
2024-05-10 16:19:03 -07:00
Krrish Dholakia
6a400a6200
test: fix test
2024-05-10 15:49:20 -07:00
Krrish Dholakia
d4d175030f
docs(huggingface.md): add text-classification to huggingface docs
2024-05-10 14:39:14 -07:00
Krrish Dholakia
c17f221b89
test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi
2024-05-10 14:07:01 -07:00
Krrish Dholakia
781d5888c3
docs(predibase.md): add support for predibase to docs
2024-05-10 10:58:35 -07:00
Simon Sanchez Viloria
e1372de9ee
Merge branch 'main' into feature/watsonx-integration
2024-05-10 12:09:09 +02:00
Simon Sanchez Viloria
d3d82827ed
(test) Add tests for WatsonX completion/acompletion streaming
2024-05-10 11:55:58 +02:00
Krrish Dholakia
d7189c21fd
feat(predibase.py): support async_completion + streaming (sync + async)
finishes up pr
2024-05-09 17:41:27 -07:00
Krrish Dholakia
186c0ec77b
feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
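
A usage sketch for the new Predibase provider; the environment variable names and model slug below are assumptions, so check the Predibase docs page added in 781d5888c3:

import os
import litellm

os.environ["PREDIBASE_API_KEY"] = "pb_..."       # placeholder credentials
os.environ["PREDIBASE_TENANT_ID"] = "my-tenant"  # assumption: tenant id read from the environment

response = litellm.completion(
    model="predibase/llama-3-8b-instruct",       # assumed model slug
    messages=[{"role": "user", "content": "Hello from Predibase"}],
)
print(response.choices[0].message.content)
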
Krish Dholakia
303e0c6226
Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role"
2024-05-07 21:42:18 -07:00
Krish Dholakia
a325bf2fb8
Merge pull request #3478 from nkvch/Issue-#3474-anthropic-roles-alternation-issue
* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role
2024-05-07 21:24:47 -07:00
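
PR #3478 addresses Anthropic's strict user/assistant alternation by merging consecutive messages of one role when only an empty message of the other role sits between them (the revert entry above later undoes this). A minimal sketch of the merging logic, not the factory.py implementation:

def merge_consecutive_roles(messages: list[dict]) -> list[dict]:
    merged: list[dict] = []
    for msg in messages:
        if not msg.get("content"):                             # skip empty placeholder turns
            continue
        if merged and merged[-1]["role"] == msg["role"]:
            merged[-1]["content"] += "\n" + msg["content"]     # fold into the previous same-role turn
        else:
            merged.append({"role": msg["role"], "content": msg["content"]})
    return merged

print(merge_consecutive_roles([
    {"role": "user", "content": "First part"},
    {"role": "assistant", "content": ""},
    {"role": "user", "content": "Second part"},
]))
# -> [{'role': 'user', 'content': 'First part\nSecond part'}]
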
Ishaan Jaff
21d3407b95
fix replicate test
2024-05-07 19:48:46 -07:00
Paul Gauthier
82a4c68e60
Added deepseek completion test
2024-05-07 11:58:05 -07:00
nkvch
389530efb4
* chore(.gitignore): add 'venv' to the list of ignored files/directories
* fix(test_completion.py): fix import order and remove unused imports
* feat(test_completion.py): add test for empty assistant message in completion_claude_3_empty_message()
2024-05-07 12:51:30 +02:00
Krrish Dholakia
863f9c60a2
refactor: trigger new build
2024-05-06 11:46:30 -07:00
Krrish Dholakia
b014a72f7a
test(test_openai_endpoints.py): change key
2024-05-06 11:19:47 -07:00
Ishaan Jaff
4bd3967a1a
(ci/cd) run again
2024-05-06 11:04:43 -07:00
Krrish Dholakia
4b5cf26c1b
fix(utils.py): handle gemini chunk no parts error
...
Fixes https://github.com/BerriAI/litellm/issues/3468
2024-05-06 10:59:53 -07:00
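
The Gemini fix above handles streaming chunks whose candidate content arrives without any parts. A hedged sketch of the defensive check (field names follow the Gemini REST response shape; this is not the utils.py code):

def gemini_chunk_text(chunk: dict) -> str:
    candidates = chunk.get("candidates") or []
    if not candidates:
        return ""
    parts = (candidates[0].get("content") or {}).get("parts") or []
    return "".join(part.get("text", "") for part in parts)

print(gemini_chunk_text({"candidates": [{"content": {}}]}))  # -> '' instead of a KeyError
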
Krrish Dholakia
b5f3f198f2
fix(utils.py): anthropic error handling
2024-05-06 07:25:12 -07:00
Krrish Dholakia
d83f0b02da
test: fix local tests
2024-05-06 07:14:33 -07:00
Jack Collins
07b13ff7c5
Remove unused ModelResponse import
2024-05-06 00:16:58 -07:00
Jack Collins
51c02fdadf
Add tests for ollama + ollama chat tool calls +/- stream
2024-05-06 00:13:42 -07:00
Krrish Dholakia
8d49b3a84c
fix(factory.py): support openai 'functions' messages
2024-05-04 12:33:39 -07:00
Krrish Dholakia
d9d5149aa1
fix(factory.py): support mapping openai 'tool' message to anthropic format
2024-05-04 10:14:52 -07:00
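
For the 'tool' message mapping, OpenAI puts tool output in a role="tool" message, while Anthropic expects a tool_result content block inside a user turn. A minimal sketch of that translation (illustrative, not the factory.py code):

def openai_tool_message_to_anthropic(msg: dict) -> dict:
    return {
        "role": "user",  # Anthropic carries tool results in a user turn
        "content": [{
            "type": "tool_result",
            "tool_use_id": msg["tool_call_id"],
            "content": msg.get("content", ""),
        }],
    }

print(openai_tool_message_to_anthropic(
    {"role": "tool", "tool_call_id": "call_123", "content": "72°F, sunny"}
))
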