e41897808d | Krrish Dholakia | 2024-05-16 17:24:08 -07:00
    fix(replicate.py): move replicate calls to being completely async
    Closes https://github.com/BerriAI/litellm/issues/3128

87a21115c5 | Krrish Dholakia | 2024-05-13 18:27:39 -07:00
    test: handle watsonx rate limit error

784ae85ba0 | Krish Dholakia | 2024-05-11 21:24:42 -07:00
    Merge branch 'main' into litellm_bedrock_command_r_support

df34c64464 | Krrish Dholakia | 2024-05-11 19:44:47 -07:00
    test(test_streaming.py): fix test

68596ced04 | Krrish Dholakia | 2024-05-11 19:39:51 -07:00
    feat(bedrock_httpx.py): working bedrock command-r sync+async streaming

8ab9c861c9 | Krish Dholakia | 2024-05-11 09:31:46 -07:00
    Merge pull request #3369 from mogith-pn/main
    Clarifai-LiteLLM: Added clarifai as LLM Provider.

ddf09a3193 | Krish Dholakia | 2024-05-09 22:21:16 -07:00
    Merge pull request #3552 from BerriAI/litellm_predibase_support
    feat(predibase.py): add support for predibase provider

7c0ab40fd5 | Krrish Dholakia | 2024-05-09 17:41:27 -07:00
    feat(predibase.py): support async_completion + streaming (sync + async)
    finishes up pr

27de9e92cc | Ishaan Jaff | 2024-05-09 08:41:31 -07:00
    test - stream_options on OpenAI text_completion

23dc15b9ec | Ishaan Jaff | 2024-05-08 21:52:39 -07:00
    test openai stream_options

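The two stream_options commits above exercise OpenAI's `stream_options={"include_usage": True}` parameter, which appends one extra chunk carrying token usage after the normal content chunks. A minimal offline sketch of that chunk sequence, using assumed chat-style chunk shapes for illustration — this is not litellm's test code:

```python
# Sketch: the chunk sequence a client sees when include_usage is requested.
# All names and dict shapes here are illustrative, not litellm internals.

def fake_openai_stream(pieces, prompt_tokens, completion_tokens):
    """Yield OpenAI-style chunks; the extra final chunk has empty choices and usage."""
    for piece in pieces:
        yield {"choices": [{"delta": {"content": piece}, "finish_reason": None}],
               "usage": None}
    yield {"choices": [{"delta": {}, "finish_reason": "stop"}], "usage": None}
    # With stream_options={"include_usage": True}, one last chunk arrives
    # with an empty choices list and the populated usage dict.
    yield {"choices": [],
           "usage": {"prompt_tokens": prompt_tokens,
                     "completion_tokens": completion_tokens,
                     "total_tokens": prompt_tokens + completion_tokens}}

chunks = list(fake_openai_stream(["Hello", ", world"], 5, 2))
text = "".join(c["choices"][0]["delta"].get("content", "")
               for c in chunks if c["choices"])
```

A consumer therefore has to skip chunks with empty `choices` when assembling text, which is exactly the edge these tests cover.
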
c6f9cb9346 | mogith-pn | 2024-05-03 14:03:38 +00:00
    Clarifai - Added streaming and async completion support

a241cdb818 | Krrish Dholakia | 2024-05-01 22:24:37 -07:00
    fix(utils.py): fix default values

b7beab2e39 | Krish Dholakia | 2024-04-27 05:48:34 -07:00
    Merge pull request #3270 from simonsanvil/feature/watsonx-integration
    (feat) add IBM watsonx.ai as an llm provider

85a331024a | Krrish Dholakia | 2024-04-25 20:22:18 -07:00
    test(test_streaming.py): fix test

f7d40260fb | Krrish Dholakia | 2024-04-25 20:07:40 -07:00
    test: rename test

2c2e4319ab | Krrish Dholakia | 2024-04-25 19:57:24 -07:00
    fix(utils.py): add more logging to identify ci/cd issue

157dd819f6 | Krrish Dholakia | 2024-04-25 13:27:29 -07:00
    fix(utils.py): fix the response object returned when n>1 for stream=true
    Fixes https://github.com/BerriAI/litellm/issues/3276

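For context on the n>1 fix above: with `n > 1` and `stream=True`, an OpenAI-compatible stream interleaves chunks for all requested completions, and each choice's `index` field says which completion a delta belongs to. A hypothetical reassembly sketch (illustrative only, not litellm's utils.py code):

```python
from collections import defaultdict

def collect_choices(chunks):
    """Group streamed deltas by choice index and join them into full texts."""
    texts = defaultdict(list)
    for chunk in chunks:
        for choice in chunk["choices"]:
            content = choice["delta"].get("content")
            if content:
                texts[choice["index"]].append(content)
    return {i: "".join(parts) for i, parts in texts.items()}

# Interleaved chunks for two choices (index 0 and 1):
chunks = [
    {"choices": [{"index": 0, "delta": {"content": "Hi"}}]},
    {"choices": [{"index": 1, "delta": {"content": "Hey"}}]},
    {"choices": [{"index": 0, "delta": {" content".strip(): " there"}}]},
]
```

A response object that loses or misnumbers these indices breaks exactly this kind of reassembly, which is what issue #3276 reported.
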
b10f03706d | Krrish Dholakia | 2024-04-24 08:06:07 -07:00
    fix(utils.py): fix streaming to not return usage dict
    Fixes https://github.com/BerriAI/litellm/issues/3237

2dd79eb30e | Simon Sanchez Viloria | 2024-04-24 12:52:29 +02:00
    (test) Added completion and embedding tests for watsonx provider

ef471480b2 | Krish Dholakia | 2024-04-23 17:25:12 -07:00
    Merge pull request #3124 from elisalimli/bugfix/add-missing-tool-calls-mistral-messages
    Add missing tool_calls and name to messages

25e594dc44 | alisalim17 | 2024-04-21 11:27:42 +04:00
    test: add test for function calling with mistral-large-latest to test_streaming.py

64fe5b146c | Krrish Dholakia | 2024-04-18 07:08:36 -07:00
    fix(utils.py): fix azure streaming logic

caa46ca905 | Krrish Dholakia | 2024-04-17 18:03:40 -07:00
    fix(utils.py): fix streaming special character flushing logic

1b4462ee70 | Krrish Dholakia | 2024-04-17 17:43:41 -07:00
    fix(utils.py): ensure streaming output parsing only applied for hf / sagemaker models
    selectively applies the <s> </s> checking

b00ad976af | Ishaan Jaff | 2024-04-16 21:09:53 -07:00
    fix function calling prompt - ask llm to respond in fahrenheit

1cd0551a1e | Krrish Dholakia | 2024-04-15 08:15:00 -07:00
    fix(anthropic_text.py): add support for async text completion calls

d1d3d932ca | Ishaan Jaff | 2024-04-07 09:56:52 -07:00
    Merge pull request #2879 from BerriAI/litellm_async_anthropic_api
    [Feat] Async Anthropic API 97.5% lower median latency

9b8555e482 | Ishaan Jaff | 2024-04-06 19:28:51 -07:00
    undo adding next-asyncio

0d067fb65d | Krrish Dholakia | 2024-04-06 18:45:17 -07:00
    test(test_streaming.py): fix vertex project name

c37383192f | Ishaan Jaff | 2024-04-06 18:28:07 -07:00
    fix - test streaming

aca692282a | Ishaan Jaff | 2024-04-06 18:26:52 -07:00
    (ci/cd) run again

ccab3cdfb5 | Ishaan Jaff | 2024-04-06 18:24:00 -07:00
    Revert "fix test streaming"
    This reverts commit f10aba1007.

9e92b9b8ad | Ishaan Jaff | 2024-04-06 18:23:10 -07:00
    fix test streaming

436133b4b3 | Ishaan Jaff | 2024-04-06 18:07:41 -07:00
    ci/cd run again

3978a17ff9 | Ishaan Jaff | 2024-04-06 17:55:26 -07:00
    test - async claude streaming

8436b611b4 | Ishaan Jaff | 2024-04-05 20:44:51 -07:00
    (ci/cd) run again

aa0c01dcf3 | Ishaan Jaff | 2024-04-05 20:38:28 -07:00
    fix gemini test

54e4326a92 | Ishaan Jaff | 2024-04-05 13:36:29 -07:00
    ci/cd - skip 429 errors from google gemini

271fe184eb | Krrish Dholakia | 2024-03-29 09:05:00 -07:00
    fix(sagemaker.py): support model_id consistently. support dynamic args for async calls

006c20a10f | Krrish Dholakia | 2024-03-29 07:56:10 -07:00
    fix(utils.py): support bedrock mistral streaming

05fddcb06b | Krrish Dholakia | 2024-03-26 13:49:42 -07:00
    fix(utils.py): don't send subsequent chunks if last chunk sent
    prevents multiple empty finish chunks from being sent

1137264d99 | Krrish Dholakia | 2024-03-26 08:57:44 -07:00
    test(test_streaming.py): add unit testing for custom stream wrapper

be055f31ee | Krrish Dholakia | 2024-03-25 19:33:57 -07:00
    test(test_azure_astreaming_and_function_calling): fix test to handle caching

26dbb76d53 | Krrish Dholakia | 2024-03-25 16:33:41 -07:00
    fix(utils.py): ensure last chunk is always empty delta w/ finish reason
    makes sure we're openai-compatible with our streaming. Adds stricter tests for this as well

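The invariant described in commit 26dbb76d53 above — an OpenAI-compatible stream ends with exactly one chunk whose delta is empty and whose finish_reason is set — can be sketched offline. This is a hypothetical helper for illustration, not the utils.py implementation:

```python
def build_stream_chunks(pieces, finish_reason="stop"):
    """Yield OpenAI-style chunks, terminating with a single empty-delta chunk."""
    for piece in pieces:
        yield {"choices": [{"delta": {"content": piece}, "finish_reason": None}]}
    # Exactly one final chunk: empty delta, finish_reason populated.
    # (Commit 05fddcb06b above guards against emitting this terminator twice.)
    yield {"choices": [{"delta": {}, "finish_reason": finish_reason}]}

chunks = list(build_stream_chunks(["foo", "bar"]))
```

Clients key their "stream is done" logic off that final chunk, which is why both commits enforce it so strictly.
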
16368fec98 | Krrish Dholakia | 2024-03-22 19:57:01 -07:00
    fix(anthropic.py): support async claude 3 tool calling + streaming
    https://github.com/BerriAI/litellm/issues/2644

697758e188 | Krrish Dholakia | 2024-03-18 22:06:29 -07:00
    test(test_streaming.py): fix test

960d7ae2a4 | ishaan-jaff | 2024-03-13 22:05:16 -07:00
    (fix-ci-cd) skip deep infra 429 errors

0bdaf3e05b | Krrish Dholakia | 2024-03-12 12:56:38 -07:00
    test: set verbose for test

1c6438c267 | Krrish Dholakia | 2024-03-12 09:52:11 -07:00
    fix(anthropic.py): support streaming with function calling

d8cf889597 | Krrish Dholakia | 2024-03-09 14:02:27 -08:00
    fix(bedrock.py): enable claude-3 streaming