Commit graph

246 commits

Author SHA1 Message Date
Krrish Dholakia
709373b15c fix(replicate.py): move replicate calls to being completely async
Closes https://github.com/BerriAI/litellm/issues/3128
2024-05-16 17:24:08 -07:00
Krrish Dholakia
d4123951d9 test: handle watsonx rate limit error 2024-05-13 18:27:39 -07:00
Krish Dholakia
1d651c6049 Merge branch 'main' into litellm_bedrock_command_r_support 2024-05-11 21:24:42 -07:00
Krrish Dholakia
b1448cd244 test(test_streaming.py): fix test 2024-05-11 19:44:47 -07:00
Krrish Dholakia
64650c0279 feat(bedrock_httpx.py): working bedrock command-r sync+async streaming 2024-05-11 19:39:51 -07:00
Krish Dholakia
8f6ae9a059 Merge pull request #3369 from mogith-pn/main
Clarifai-LiteLLM : Added clarifai as LLM Provider.
2024-05-11 09:31:46 -07:00
Krish Dholakia
a671046b45 Merge pull request #3552 from BerriAI/litellm_predibase_support
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
Krrish Dholakia
d7189c21fd feat(predibase.py): support async_completion + streaming (sync + async)
finishes up pr
2024-05-09 17:41:27 -07:00
Ishaan Jaff
a29fcc057b test - stream_options on OpenAI text_completion 2024-05-09 08:41:31 -07:00
Ishaan Jaff
f2965660dd test openai stream_options 2024-05-08 21:52:39 -07:00
mogith-pn
723ef9963e Clarifai - Added streaming and async completion support 2024-05-03 14:03:38 +00:00
Krrish Dholakia
29aefaa7d7 fix(utils.py): fix default values 2024-05-01 22:24:37 -07:00
Krish Dholakia
2d976cfabc Merge pull request #3270 from simonsanvil/feature/watsonx-integration
(feat) add IBM watsonx.ai as an llm provider
2024-04-27 05:48:34 -07:00
Krrish Dholakia
9eb75cc159 test(test_streaming.py): fix test 2024-04-25 20:22:18 -07:00
Krrish Dholakia
5307510592 test: rename test 2024-04-25 20:07:40 -07:00
Krrish Dholakia
850b056df5 fix(utils.py): add more logging to identify ci/cd issue 2024-04-25 19:57:24 -07:00
Krrish Dholakia
6c5c7cca3d fix(utils.py): fix the response object returned when n>1 for stream=true
Fixes https://github.com/BerriAI/litellm/issues/3276
2024-04-25 13:27:29 -07:00
Krrish Dholakia
48c2c3d78a fix(utils.py): fix streaming to not return usage dict
Fixes https://github.com/BerriAI/litellm/issues/3237
2024-04-24 08:06:07 -07:00
Simon Sanchez Viloria
9fc30e8b31 (test) Added completion and embedding tests for watsonx provider 2024-04-24 12:52:29 +02:00
Krish Dholakia
8d2e411df6 Merge pull request #3124 from elisalimli/bugfix/add-missing-tool-calls-mistral-messages
Add missing tool_calls and name to messages
2024-04-23 17:25:12 -07:00
alisalim17
e404dd7569 test: add test for function calling with mistral large latest to test_streaming.py 2024-04-21 11:27:42 +04:00
Krrish Dholakia
b38c09c87f fix(utils.py): fix azure streaming logic 2024-04-18 07:08:36 -07:00
Krrish Dholakia
15ae7a8314 fix(utils.py): fix streaming special character flushing logic 2024-04-17 18:03:40 -07:00
Krrish Dholakia
7d0086d742 fix(utils.py): ensure streaming output parsing only applied for hf / sagemaker models
selectively applies the <s> </s> checking
2024-04-17 17:43:41 -07:00
Ishaan Jaff
5393930701 fix function calling prompt - ask llm to respond in fahrenheit 2024-04-16 21:09:53 -07:00
Krrish Dholakia
26286a54b8 fix(anthropic_text.py): add support for async text completion calls 2024-04-15 08:15:00 -07:00
Ishaan Jaff
a5aef6ec00 Merge pull request #2879 from BerriAI/litellm_async_anthropic_api
[Feat] Async Anthropic API 97.5% lower median latency
2024-04-07 09:56:52 -07:00
Ishaan Jaff
d51e853b60 undo adding next-asyncio 2024-04-06 19:28:51 -07:00
Krrish Dholakia
5f910c1811 test(test_streaming.py): fix vertex project name 2024-04-06 18:45:17 -07:00
Ishaan Jaff
f08486448c fix - test streaming 2024-04-06 18:28:07 -07:00
Ishaan Jaff
2622f0351b (ci/cd) run again 2024-04-06 18:26:52 -07:00
Ishaan Jaff
4512510d37 Revert "fix test streaming"
This reverts commit f10aba1007.
2024-04-06 18:24:00 -07:00
Ishaan Jaff
f10aba1007 fix test streaming 2024-04-06 18:23:10 -07:00
Ishaan Jaff
9be6b7ec7c ci/cd run again 2024-04-06 18:07:41 -07:00
Ishaan Jaff
548b2b6861 test - async claude streaming 2024-04-06 17:55:26 -07:00
Ishaan Jaff
83580dc00a (ci/cd) run again 2024-04-05 20:44:51 -07:00
Ishaan Jaff
122dff007b fix gemini test 2024-04-05 20:38:28 -07:00
Ishaan Jaff
3b2c34d587 ci/cd - skip 429 errors from google gemini 2024-04-05 13:36:29 -07:00
Krrish Dholakia
109cd93a39 fix(sagemaker.py): support model_id consistently. support dynamic args for async calls 2024-03-29 09:05:00 -07:00
Krrish Dholakia
cd53291b62 fix(utils.py): support bedrock mistral streaming 2024-03-29 07:56:10 -07:00
Krrish Dholakia
3a82ff2ef2 fix(utils.py): don't send subsequent chunks if last chunk sent
prevents multiple empty finish chunks from being sent
2024-03-26 13:49:42 -07:00
Krrish Dholakia
2dd2b8a8e3 test(test_streaming.py): add unit testing for custom stream wrapper 2024-03-26 08:57:44 -07:00
Krrish Dholakia
4d85387b5a test(test_azure_astreaming_and_function_calling): fix test to handle caching 2024-03-25 19:33:57 -07:00
Krrish Dholakia
9e1e97528d fix(utils.py): ensure last chunk is always empty delta w/ finish reason
makes sure we're openai-compatible with our streaming. Adds stricter tests for this as well
2024-03-25 16:33:41 -07:00
Krrish Dholakia
42a7588b04 fix(anthropic.py): support async claude 3 tool calling + streaming
https://github.com/BerriAI/litellm/issues/2644
2024-03-22 19:57:01 -07:00
Krrish Dholakia
b6f094c5ec test(test_streaming.py): fix test 2024-03-18 22:06:29 -07:00
ishaan-jaff
ebfefe61ea (fix-ci-cd) skip deep infra 429 errors 2024-03-13 22:05:16 -07:00
Krrish Dholakia
e342ecd873 test: set verbose for test 2024-03-12 12:56:38 -07:00
Krrish Dholakia
86ed0aaba8 fix(anthropic.py): support streaming with function calling 2024-03-12 09:52:11 -07:00
Krrish Dholakia
8b24ddcbbd fix(bedrock.py): enable claude-3 streaming 2024-03-09 14:02:27 -08:00