Author | Commit | Message | Date
Krrish Dholakia | 3a82ff2ef2 | fix(utils.py): don't send subsequent chunks if last chunk sent (prevents multiple empty finish chunks from being sent) | 2024-03-26 13:49:42 -07:00
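The commit above guards the stream wrapper so that nothing is forwarded after the final chunk. A minimal sketch of that kind of guard, assuming dict-shaped chunks with a `finish_reason` key (the class and attribute names are illustrative, not litellm's actual internals):

```python
# Minimal sketch of a streaming wrapper that stops emitting chunks once the
# final (finish-reason) chunk has been sent. Names here are illustrative,
# not litellm's actual implementation.
class StreamWrapperSketch:
    def __init__(self, raw_chunks):
        self.raw_chunks = iter(raw_chunks)
        self.sent_last_chunk = False  # set True after the finish chunk goes out

    def __iter__(self):
        return self

    def __next__(self):
        if self.sent_last_chunk:
            # Nothing is forwarded after the finish chunk -- this is what
            # prevents multiple empty "finish" chunks from reaching the caller.
            raise StopIteration
        chunk = next(self.raw_chunks)
        if chunk.get("finish_reason") is not None:
            self.sent_last_chunk = True
        return chunk


# Example: the duplicate trailing finish chunk is never forwarded.
chunks = [
    {"delta": "Hel", "finish_reason": None},
    {"delta": "lo", "finish_reason": None},
    {"delta": "", "finish_reason": "stop"},
    {"delta": "", "finish_reason": "stop"},  # duplicate finish chunk from the provider
]
assert len(list(StreamWrapperSketch(chunks))) == 3
```

Once a finish chunk has gone out, every later iteration step ends the stream instead of forwarding further (possibly empty) finish chunks.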
Krrish Dholakia | 2dd2b8a8e3 | test(test_streaming.py): add unit testing for custom stream wrapper | 2024-03-26 08:57:44 -07:00
Krrish Dholakia | 4d85387b5a | test(test_azure_astreaming_and_function_calling): fix test to handle caching | 2024-03-25 19:33:57 -07:00
Krrish Dholakia | 9e1e97528d | fix(utils.py): ensure last chunk is always empty delta w/ finish reason (makes sure we're openai-compatible with our streaming; adds stricter tests for this as well) | 2024-03-25 16:33:41 -07:00
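An OpenAI-compatible stream ends with a single chunk whose delta carries no content and whose `finish_reason` is set. A hedged sketch of the kind of stricter check that commit describes (`validate_final_chunk` is a hypothetical helper; `chunks` is assumed to be the list collected from a streaming call):

```python
# Hedged sketch of a strict check on streamed chunks, in the spirit of the
# "stricter tests" mentioned above. The helper name is hypothetical, not a
# litellm test utility.
def validate_final_chunk(chunks):
    *middle, last = chunks
    # Only the last chunk should carry a finish_reason ...
    assert all(c.choices[0].finish_reason is None for c in middle)
    assert last.choices[0].finish_reason is not None
    # ... and its delta should be empty, matching OpenAI's streaming format.
    assert not last.choices[0].delta.content
```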
Krrish Dholakia | 42a7588b04 | fix(anthropic.py): support async claude 3 tool calling + streaming (https://github.com/BerriAI/litellm/issues/2644) | 2024-03-22 19:57:01 -07:00
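A hedged usage sketch of what that combination looks like from the caller's side: async Claude 3 tool calling over a stream via `litellm.acompletion`. The model name and tool schema are examples, not requirements of the fix:

```python
import asyncio
import litellm

# Example OpenAI-format tool definition; the schema is illustrative.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

async def main():
    response = await litellm.acompletion(
        model="claude-3-opus-20240229",
        messages=[{"role": "user", "content": "What's the weather in Boston?"}],
        tools=tools,
        stream=True,
    )
    async for chunk in response:
        # Tool-call arguments arrive incrementally on the delta, as in OpenAI's format.
        print(chunk.choices[0].delta)

asyncio.run(main())
```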
Krrish Dholakia | b6f094c5ec | test(test_streaming.py): fix test | 2024-03-18 22:06:29 -07:00
ishaan-jaff | ebfefe61ea | (fix-ci-cd) skip deep infra 429 errors | 2024-03-13 22:05:16 -07:00
Krrish Dholakia | e342ecd873 | test: set verbose for test | 2024-03-12 12:56:38 -07:00
Krrish Dholakia | 86ed0aaba8 | fix(anthropic.py): support streaming with function calling | 2024-03-12 09:52:11 -07:00
Krrish Dholakia | 8b24ddcbbd | fix(bedrock.py): enable claude-3 streaming | 2024-03-09 14:02:27 -08:00
ishaan-jaff | eb53136448 | (ci/cd) run again | 2024-03-08 22:05:39 -08:00
Krrish Dholakia | 5f225e3a88 | test(test_streaming.py): skip flaky test | 2024-03-04 22:33:45 -08:00
Krrish Dholakia | b2eef61679 | fix(test_streaming.py): skip flaky test | 2024-03-04 20:46:21 -08:00
Krrish Dholakia | 478307d4cf | fix(bedrock.py): support anthropic messages api on bedrock (claude-3) | 2024-03-04 17:15:47 -08:00
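A hedged usage sketch of streaming a Claude 3 model through Bedrock with litellm; the `bedrock/...` model id is an example and may differ in your account:

```python
import litellm

# Hedged sketch: streaming Claude 3 on Bedrock through litellm.
# The model id is an example; check your Bedrock account for the exact id.
response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    stream=True,
)
for chunk in response:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="")
```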
ishaan-jaff | 3e8cecc03b | (fix) test streaming | 2024-03-04 08:29:16 -08:00
Krish Dholakia | 0633eecdb8 | Merge pull request #2236 from BerriAI/litellm_faster_usage_logs_pg_db (fix(proxy_server.py): introduces a beta endpoint for admin to view global spend) | 2024-02-28 17:08:55 -08:00
Krrish Dholakia | 4c951d20bc | test: removing aws tests - account suspended - pending their approval | 2024-02-28 13:46:20 -08:00
ishaan-jaff | 4bd15985b4 | (test) hugging face test - their endpoint is unstable | 2024-02-28 12:11:29 -08:00
Krrish Dholakia | 6b14c6287b | test(test_streaming.py): add more logging | 2024-02-26 21:44:28 -08:00
Krrish Dholakia | 5b06627c09 | fix(utils.py): fix streaming | 2024-02-26 12:52:53 -08:00
Krrish Dholakia | dfb1d34e26 | fix(utils.py): support returning caching streaming response for function calling streaming calls | 2024-02-26 12:31:00 -08:00
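That change concerns cache hits still coming back as a streamed response when the original call was a streaming function-calling call. A hedged sketch of the caller-side pattern, assuming litellm's in-memory `Cache` and the `caching=True` flag:

```python
import litellm
from litellm.caching import Cache

# Hedged sketch: enable litellm's in-memory cache, then make the same
# streaming function-calling call twice. The second call should be served
# from cache but still come back as an iterable stream of chunks.
litellm.cache = Cache()

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

for attempt in range(2):
    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "What's the weather in Boston?"}],
        tools=tools,
        stream=True,
        caching=True,
    )
    chunks = list(response)  # both the live and the cached call should stream
    print(f"attempt {attempt}: received {len(chunks)} chunks")
```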
Krrish Dholakia | 97b9d570a6 | fix(utils.py): stricter azure function calling tests | 2024-02-23 20:55:32 -08:00
Krish Dholakia | dd4439b6a8 | Merge branch 'main' into litellm_fix_azure_function_calling_streaming | 2024-02-22 22:36:38 -08:00
Krrish Dholakia | ca03c57277 | test(test_streaming.py): fix openai streaming function calling test | 2024-02-22 21:50:14 -08:00
Krrish Dholakia | e07cf33cba | test(test_streaming.py): add exception mapping for palm timeout error | 2024-02-22 21:36:57 -08:00
Krrish Dholakia | b6017115e3 | fix(utils.py): ensure argument is always a string | 2024-02-22 15:15:56 -08:00
Krrish Dholakia | f6e52ac771 | test: handle api errors for gemini/palm testing | 2024-02-21 21:44:08 -08:00
Krrish Dholakia | c9c6547ef9 | test(test_streaming.py): handle gemini 500 error | 2024-02-21 21:32:03 -08:00
Krrish Dholakia | 1d3bef2e9c | fix(gemini.py): implement custom streamer | 2024-02-20 17:10:51 -08:00
Krrish Dholakia | 45eb4a5fcc | fix(gemini.py): fix async streaming + add native async completions | 2024-02-19 22:41:36 -08:00
Krrish Dholakia | 5f9e141d1e | fix(huggingface_restapi.py): return streamed response correctly | 2024-02-16 13:25:13 -08:00
Krrish Dholakia | eb45df16f1 | fix(test_streaming.py): handle hf tgi zephyr not loading for streaming issue | 2024-02-15 19:24:02 -08:00
Krrish Dholakia | b1bc30ee16 | feat(sagemaker.py): aioboto3 streaming support | 2024-02-12 21:18:34 -08:00
Krrish Dholakia | 241f0aad5e | fix(utils.py): fix deepinfra streaming | 2024-02-01 18:46:50 -08:00
Krrish Dholakia | 874ebad31d | test(test_streaming.py): add test to make sure role is not returned in middle chunks during streaming | 2024-02-01 09:55:56 -08:00
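The assertion that test adds can be sketched roughly as follows, assuming `chunks` is the list of chunks collected from a streaming call (the helper name is hypothetical):

```python
# Hedged sketch of the "role only in the first chunk" check described above.
def assert_role_only_in_first_chunk(chunks):
    first, *rest = chunks
    assert first.choices[0].delta.role == "assistant"
    for chunk in rest:
        # Middle (and final) chunks should not repeat the role field.
        assert getattr(chunk.choices[0].delta, "role", None) is None
```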
Krrish Dholakia | e948b39e3a | test(test_streaming.py): fix test to handle none chunk | 2024-01-25 18:34:13 -08:00
Krrish Dholakia | 09ec6d6458 | fix(utils.py): fix sagemaker async logging for sync streaming (https://github.com/BerriAI/litellm/issues/1592) | 2024-01-25 12:49:45 -08:00
ishaan-jaff | 64a387d09b | (test) test chunk_ids match across chunks for bedrock | 2024-01-23 13:06:05 -08:00
ishaan-jaff | 6c1f8378c1 | (test) fix sagemaker stream test | 2024-01-23 10:07:13 -08:00
Krrish Dholakia | b07677c6be | fix(gemini.py): support streaming | 2024-01-19 20:21:34 -08:00
ishaan-jaff | 4085c465f5 | (ci/cd) skip streaming + hosted ollama flaky test | 2024-01-16 15:01:56 -08:00
ishaan-jaff | d6f0cb8756 | (ci/cd) fix ollama hosted testing | 2024-01-16 12:27:16 -08:00
ishaan-jaff | f85b64b85c | (test) custom_llm_provider in streaming response | 2024-01-12 17:15:15 -08:00
Krrish Dholakia | e99a41307a | test: testing fixes | 2024-01-09 10:23:34 +05:30
Krrish Dholakia | b1fd0a164b | fix(huggingface_restapi.py): support timeouts for huggingface + openai text completions (https://github.com/BerriAI/litellm/issues/1334) | 2024-01-08 11:40:56 +05:30
ishaan-jaff | 6f9d3fc3bc | (ci/cd) retry hosted ollama + stream test 3 times | 2024-01-05 18:02:20 +05:30
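A hedged sketch of that retry pattern: run the streaming call against the hosted endpoint and only fail the test after three attempts (the model name is illustrative, and a hosted endpoint would typically also need an `api_base`):

```python
import litellm

# Hedged sketch of retrying a flaky streaming test up to 3 times.
def run_stream_once():
    response = litellm.completion(
        model="ollama/llama2",  # a hosted setup would also pass api_base=...
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )
    return [chunk for chunk in response]

def test_hosted_ollama_stream_with_retries(max_attempts=3):
    last_error = None
    for attempt in range(max_attempts):
        try:
            chunks = run_stream_once()
            assert len(chunks) > 0
            return
        except Exception as e:  # flaky hosted endpoint, retry broadly
            last_error = e
    raise last_error
```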
Krrish Dholakia | c9fdbaf898 | fix(azure.py,-openai.py): correctly raise errors if streaming calls fail | 2023-12-27 15:08:37 +05:30
Krrish Dholakia | 4905929de3 | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
Krrish Dholakia | 402b2e5733 | build(test_streaming.py): fix linting issues | 2023-12-25 07:34:54 +05:30
Krrish Dholakia | 343a06fd84 | fix(proxy_server.py): raise streaming exceptions | 2023-12-25 07:18:09 +05:30