Simon Sanchez Viloria | 9fc30e8b31 | (test) Added completion and embedding tests for watsonx provider | 2024-04-24 12:52:29 +02:00
Krish Dholakia | 8d2e411df6 | Merge pull request #3124 from elisalimli/bugfix/add-missing-tool-calls-mistral-messages (Add missing tool_calls and name to messages) | 2024-04-23 17:25:12 -07:00
alisalim17 | e404dd7569 | test: add test for function calling with mistral large latest to test_streaming.py | 2024-04-21 11:27:42 +04:00
Krrish Dholakia | b38c09c87f | fix(utils.py): fix azure streaming logic | 2024-04-18 07:08:36 -07:00
Krrish Dholakia | 15ae7a8314 | fix(utils.py): fix streaming special character flushing logic | 2024-04-17 18:03:40 -07:00
Krrish Dholakia | 7d0086d742 | fix(utils.py): ensure streaming output parsing only applied for hf / sagemaker models (selectively applies the <s> </s> checking) | 2024-04-17 17:43:41 -07:00
Ishaan Jaff | 5393930701 | fix function calling prompt - ask llm to respond in fahrenheit | 2024-04-16 21:09:53 -07:00
Krrish Dholakia | 26286a54b8 | fix(anthropic_text.py): add support for async text completion calls | 2024-04-15 08:15:00 -07:00
Ishaan Jaff | a5aef6ec00 | Merge pull request #2879 from BerriAI/litellm_async_anthropic_api ([Feat] Async Anthropic API 97.5% lower median latency) | 2024-04-07 09:56:52 -07:00
Ishaan Jaff | d51e853b60 | undo adding next-asyncio | 2024-04-06 19:28:51 -07:00
Krrish Dholakia | 5f910c1811 | test(test_streaming.py): fix vertex project name | 2024-04-06 18:45:17 -07:00
Ishaan Jaff | f08486448c | fix - test streaming | 2024-04-06 18:28:07 -07:00
Ishaan Jaff | 2622f0351b | (ci/cd) run again | 2024-04-06 18:26:52 -07:00
Ishaan Jaff | 4512510d37 | Revert "fix test streaming" (This reverts commit f10aba1007.) | 2024-04-06 18:24:00 -07:00
Ishaan Jaff | f10aba1007 | fix test streaming | 2024-04-06 18:23:10 -07:00
Ishaan Jaff | 9be6b7ec7c | ci/cd run again | 2024-04-06 18:07:41 -07:00
Ishaan Jaff | 548b2b6861 | test - async claude streaming | 2024-04-06 17:55:26 -07:00
Ishaan Jaff | 83580dc00a | (ci/cd) run again | 2024-04-05 20:44:51 -07:00
Ishaan Jaff | 122dff007b | fix gemini test | 2024-04-05 20:38:28 -07:00
Ishaan Jaff | 3b2c34d587 | ci/cd - skip 429 errors from google gemini | 2024-04-05 13:36:29 -07:00
Krrish Dholakia | 109cd93a39 | fix(sagemaker.py): support model_id consistently. support dynamic args for async calls | 2024-03-29 09:05:00 -07:00
Krrish Dholakia | cd53291b62 | fix(utils.py): support bedrock mistral streaming | 2024-03-29 07:56:10 -07:00
Krrish Dholakia | 3a82ff2ef2 | fix(utils.py): don't send subsequent chunks if last chunk sent (prevents multiple empty finish chunks from being sent) | 2024-03-26 13:49:42 -07:00
Krrish Dholakia | 2dd2b8a8e3 | test(test_streaming.py): add unit testing for custom stream wrapper | 2024-03-26 08:57:44 -07:00
Krrish Dholakia | 4d85387b5a | test(test_azure_astreaming_and_function_calling): fix test to handle caching | 2024-03-25 19:33:57 -07:00
Krrish Dholakia | 9e1e97528d | fix(utils.py): ensure last chunk is always empty delta w/ finish reason (makes sure we're openai-compatible with our streaming; adds stricter tests for this as well) | 2024-03-25 16:33:41 -07:00
Krrish Dholakia | 42a7588b04 | fix(anthropic.py): support async claude 3 tool calling + streaming (https://github.com/BerriAI/litellm/issues/2644) | 2024-03-22 19:57:01 -07:00
Krrish Dholakia | b6f094c5ec | test(test_streaming.py): fix test | 2024-03-18 22:06:29 -07:00
ishaan-jaff | ebfefe61ea | (fix-ci-cd) skip deep infra 429 errors | 2024-03-13 22:05:16 -07:00
Krrish Dholakia | e342ecd873 | test: set verbose for test | 2024-03-12 12:56:38 -07:00
Krrish Dholakia | 86ed0aaba8 | fix(anthropic.py): support streaming with function calling | 2024-03-12 09:52:11 -07:00
Krrish Dholakia | 8b24ddcbbd | fix(bedrock.py): enable claude-3 streaming | 2024-03-09 14:02:27 -08:00
ishaan-jaff | eb53136448 | (ci/cd) run again | 2024-03-08 22:05:39 -08:00
Krrish Dholakia | 5f225e3a88 | test(test_streaming.py): skip flaky test | 2024-03-04 22:33:45 -08:00
Krrish Dholakia | b2eef61679 | fix(test_streaming.py): skip flaky test | 2024-03-04 20:46:21 -08:00
Krrish Dholakia | 478307d4cf | fix(bedrock.py): support anthropic messages api on bedrock (claude-3) | 2024-03-04 17:15:47 -08:00
ishaan-jaff | 3e8cecc03b | (fix) test streaming | 2024-03-04 08:29:16 -08:00
Krish Dholakia | 0633eecdb8 | Merge pull request #2236 from BerriAI/litellm_faster_usage_logs_pg_db (fix(proxy_server.py): introduces a beta endpoint for admin to view global spend) | 2024-02-28 17:08:55 -08:00
Krrish Dholakia | 4c951d20bc | test: removing aws tests - account suspended - pending their approval | 2024-02-28 13:46:20 -08:00
ishaan-jaff | 4bd15985b4 | (test) hugging face test - their endpoint is unstable | 2024-02-28 12:11:29 -08:00
Krrish Dholakia | 6b14c6287b | test(test_streaming.py): add more logging | 2024-02-26 21:44:28 -08:00
Krrish Dholakia | 5b06627c09 | fix(utils.py): fix streaming | 2024-02-26 12:52:53 -08:00
Krrish Dholakia | dfb1d34e26 | fix(utils.py): support returning caching streaming response for function calling streaming calls | 2024-02-26 12:31:00 -08:00
Krrish Dholakia | 97b9d570a6 | fix(utils.py): stricter azure function calling tests | 2024-02-23 20:55:32 -08:00
Krish Dholakia | dd4439b6a8 | Merge branch 'main' into litellm_fix_azure_function_calling_streaming | 2024-02-22 22:36:38 -08:00
Krrish Dholakia | ca03c57277 | test(test_streaming.py): fix openai streaming function calling test | 2024-02-22 21:50:14 -08:00
Krrish Dholakia | e07cf33cba | test(test_streaming.py): add exception mapping for palm timeout error | 2024-02-22 21:36:57 -08:00
Krrish Dholakia | b6017115e3 | fix(utils.py): ensure argument is always a string | 2024-02-22 15:15:56 -08:00
Krrish Dholakia | f6e52ac771 | test: handle api errors for gemini/palm testing | 2024-02-21 21:44:08 -08:00
Krrish Dholakia | c9c6547ef9 | test(test_streaming.py): handle gemini 500 error | 2024-02-21 21:32:03 -08:00
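Several commits above (e.g. 9e1e97528d and 3a82ff2ef2) revolve around the OpenAI-compatible streaming contract: the final chunk carries an empty delta together with a finish_reason, and no further chunks are sent after it. The sketch below illustrates that contract only; the names are hypothetical and it is not LiteLLM's actual stream-wrapper code.

```python
# Hypothetical sketch of enforcing the OpenAI streaming termination rule;
# illustrative only, not LiteLLM's CustomStreamWrapper implementation.
from typing import Iterator


def terminate_stream(chunks: Iterator[dict]) -> Iterator[dict]:
    """Yield chunks unchanged, then exactly one finishing chunk with an
    empty delta and a finish_reason, and nothing after it."""
    finished = False
    for chunk in chunks:
        if finished:
            break  # never emit chunks after the finishing chunk
        choice = chunk["choices"][0]
        if choice.get("finish_reason") is not None:
            finished = True
            # The closing chunk carries an empty delta, as the OpenAI
            # streaming format expects.
            choice["delta"] = {}
        yield chunk
    if not finished:
        # Source stream ended without a finish chunk: emit one ourselves.
        yield {"choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}]}
```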