Commit graph

254 commits

Author SHA1 Message Date
Andres Barbaro
2cce5b3959 Remove unnecessary import 2024-05-24 11:00:29 -05:00
Andres Barbaro
c893674180 Add test to validate chunks with no delta are processed correctly 2024-05-24 10:57:35 -05:00
Krrish Dholakia
143a44823a feat(databricks.py): adds databricks support - completion, async, streaming
Closes https://github.com/BerriAI/litellm/issues/2160
2024-05-23 16:29:46 -07:00
Krrish Dholakia
808bea03db test(test_streaming.py): retry if openai is inconsistent with stream options 2024-05-21 18:51:24 -07:00
Krrish Dholakia
56084d5ac1 fix(bedrock_httpx.py): support mapping for bedrock cohere command r text 2024-05-17 16:13:49 -07:00
Krrish Dholakia
13e4196e3e fix(bedrock_httpx.py): add async support for bedrock amazon, meta, mistral models 2024-05-16 22:39:25 -07:00
Krrish Dholakia
8409b39f0d fix(bedrock_httpx.py): move bedrock ai21 calls to being async 2024-05-16 22:21:30 -07:00
Krrish Dholakia
118fc4ffac fix(bedrock_httpx.py): move anthropic bedrock calls to httpx
Fixing https://github.com/BerriAI/litellm/issues/2921
2024-05-16 21:51:55 -07:00
Krrish Dholakia
e41897808d fix(replicate.py): move replicate calls to being completely async
Closes https://github.com/BerriAI/litellm/issues/3128
2024-05-16 17:24:08 -07:00
Krrish Dholakia
87a21115c5 test: handle watsonx rate limit error 2024-05-13 18:27:39 -07:00
Krish Dholakia
784ae85ba0 Merge branch 'main' into litellm_bedrock_command_r_support 2024-05-11 21:24:42 -07:00
Krrish Dholakia
df34c64464 test(test_streaming.py): fix test 2024-05-11 19:44:47 -07:00
Krrish Dholakia
68596ced04 feat(bedrock_httpx.py): working bedrock command-r sync+async streaming 2024-05-11 19:39:51 -07:00
Krish Dholakia
8ab9c861c9 Merge pull request #3369 from mogith-pn/main
Clarifai-LiteLLM: Added Clarifai as an LLM provider.
2024-05-11 09:31:46 -07:00
Krish Dholakia
ddf09a3193 Merge pull request #3552 from BerriAI/litellm_predibase_support
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
Krrish Dholakia
7c0ab40fd5 feat(predibase.py): support async_completion + streaming (sync + async)
Finishes up the PR
2024-05-09 17:41:27 -07:00
Ishaan Jaff
27de9e92cc test - stream_options on OpenAI text_completion 2024-05-09 08:41:31 -07:00
Ishaan Jaff
23dc15b9ec test openai stream_options 2024-05-08 21:52:39 -07:00
mogith-pn
c6f9cb9346 Clarifai - Added streaming and async completion support 2024-05-03 14:03:38 +00:00
Krrish Dholakia
a241cdb818 fix(utils.py): fix default values 2024-05-01 22:24:37 -07:00
Krish Dholakia
b7beab2e39 Merge pull request #3270 from simonsanvil/feature/watsonx-integration
(feat) add IBM watsonx.ai as an llm provider
2024-04-27 05:48:34 -07:00
Krrish Dholakia
85a331024a test(test_streaming.py): fix test 2024-04-25 20:22:18 -07:00
Krrish Dholakia
f7d40260fb test: rename test 2024-04-25 20:07:40 -07:00
Krrish Dholakia
2c2e4319ab fix(utils.py): add more logging to identify ci/cd issue 2024-04-25 19:57:24 -07:00
Krrish Dholakia
157dd819f6 fix(utils.py): fix the response object returned when n>1 for stream=true
Fixes https://github.com/BerriAI/litellm/issues/3276
2024-04-25 13:27:29 -07:00
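The n>1 streaming fix above concerns rebuilding multiple completions from a single stream, where each chunk carries a choice index. A minimal illustrative sketch of that idea (names and shapes are assumptions, not litellm's actual code):

```python
# Hypothetical sketch: accumulating streamed deltas into n separate
# completions. Each chunk is (choice_index, delta_text), mirroring the
# OpenAI streaming format; this is NOT litellm's implementation.

def collect_choices(chunks, n):
    """Accumulate streamed deltas into n separate completion strings."""
    texts = [""] * n
    for index, delta in chunks:
        texts[index] += delta
    return texts
```

With interleaved chunks such as `[(0, "Hel"), (1, "Wor"), (0, "lo"), (1, "ld")]` and `n=2`, this yields the two completions `["Hello", "World"]`.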
Krrish Dholakia
b10f03706d fix(utils.py): fix streaming to not return usage dict
Fixes https://github.com/BerriAI/litellm/issues/3237
2024-04-24 08:06:07 -07:00
Simon Sanchez Viloria
2dd79eb30e (test) Added completion and embedding tests for watsonx provider 2024-04-24 12:52:29 +02:00
Krish Dholakia
ef471480b2 Merge pull request #3124 from elisalimli/bugfix/add-missing-tool-calls-mistral-messages
Add missing tool_calls and name to messages
2024-04-23 17:25:12 -07:00
alisalim17
25e594dc44 test: add test for function calling with mistral-large-latest to test_streaming.py 2024-04-21 11:27:42 +04:00
Krrish Dholakia
64fe5b146c fix(utils.py): fix azure streaming logic 2024-04-18 07:08:36 -07:00
Krrish Dholakia
caa46ca905 fix(utils.py): fix streaming special character flushing logic 2024-04-17 18:03:40 -07:00
Krrish Dholakia
1b4462ee70 fix(utils.py): ensure streaming output parsing only applied for hf / sagemaker models
Selectively applies the <s> / </s> checking
2024-04-17 17:43:41 -07:00
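The commit above restricts special-token stripping to hf / sagemaker model output. A hypothetical filter illustrating the `<s>` / `</s>` checking it refers to (function name and exact behavior are assumptions):

```python
# Hypothetical sketch of stripping sentence-boundary tokens that some
# hf / sagemaker models emit in their raw output; not litellm's code.

def strip_special_tokens(text: str) -> str:
    """Remove <s> and </s> markers from streamed model text."""
    for token in ("<s>", "</s>"):
        text = text.replace(token, "")
    return text
```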
Ishaan Jaff
b00ad976af fix function calling prompt - ask llm to respond in fahrenheit 2024-04-16 21:09:53 -07:00
Krrish Dholakia
1cd0551a1e fix(anthropic_text.py): add support for async text completion calls 2024-04-15 08:15:00 -07:00
Ishaan Jaff
d1d3d932ca Merge pull request #2879 from BerriAI/litellm_async_anthropic_api
[Feat] Async Anthropic API 97.5% lower median latency
2024-04-07 09:56:52 -07:00
Ishaan Jaff
9b8555e482 undo adding next-asyncio 2024-04-06 19:28:51 -07:00
Krrish Dholakia
0d067fb65d test(test_streaming.py): fix vertex project name 2024-04-06 18:45:17 -07:00
Ishaan Jaff
c37383192f fix - test streaming 2024-04-06 18:28:07 -07:00
Ishaan Jaff
aca692282a (ci/cd) run again 2024-04-06 18:26:52 -07:00
Ishaan Jaff
ccab3cdfb5 Revert "fix test streaming"
This reverts commit f10aba1007.
2024-04-06 18:24:00 -07:00
Ishaan Jaff
9e92b9b8ad fix test streaming 2024-04-06 18:23:10 -07:00
Ishaan Jaff
436133b4b3 ci/cd run again 2024-04-06 18:07:41 -07:00
Ishaan Jaff
3978a17ff9 test - async claude streaming 2024-04-06 17:55:26 -07:00
Ishaan Jaff
8436b611b4 (ci/cd) run again 2024-04-05 20:44:51 -07:00
Ishaan Jaff
aa0c01dcf3 fix gemini test 2024-04-05 20:38:28 -07:00
Ishaan Jaff
54e4326a92 ci/cd - skip 429 errors from google gemini 2024-04-05 13:36:29 -07:00
Krrish Dholakia
271fe184eb fix(sagemaker.py): support model_id consistently. support dynamic args for async calls 2024-03-29 09:05:00 -07:00
Krrish Dholakia
006c20a10f fix(utils.py): support bedrock mistral streaming 2024-03-29 07:56:10 -07:00
Krrish Dholakia
05fddcb06b fix(utils.py): don't send subsequent chunks if last chunk sent
prevents multiple empty finish chunks from being sent
2024-03-26 13:49:42 -07:00
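The fix above stops a stream from emitting anything after the final chunk, preventing multiple empty finish chunks. A minimal sketch of that guard, assuming dict-shaped chunks with a `finish_reason` key (this is an illustration, not litellm's actual stream wrapper):

```python
# Hypothetical sketch: once a chunk with a finish_reason has been sent,
# drop everything that follows it; not litellm's implementation.

class FinishGuard:
    """Suppresses chunks yielded after the finish chunk has been sent."""

    def __init__(self):
        self.sent_last_chunk = False

    def filter(self, chunks):
        for chunk in chunks:
            if self.sent_last_chunk:
                continue  # swallow trailing empty finish chunks
            if chunk.get("finish_reason") is not None:
                self.sent_last_chunk = True
            yield chunk
```

Feeding it a stream that ends with two finish chunks passes through exactly one, which is the behavior the commit message describes.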
Krrish Dholakia
1137264d99 test(test_streaming.py): add unit testing for custom stream wrapper 2024-03-26 08:57:44 -07:00