Author | Commit | Message | Date
Krrish Dholakia | be055f31ee | test(test_azure_astreaming_and_function_calling): fix test to handle caching | 2024-03-25 19:33:57 -07:00
Krrish Dholakia | 26dbb76d53 | fix(utils.py): ensure last chunk is always empty delta w/ finish reason. Makes sure we're openai-compatible with our streaming; adds stricter tests for this as well | 2024-03-25 16:33:41 -07:00
Krrish Dholakia | 16368fec98 | fix(anthropic.py): support async claude 3 tool calling + streaming (https://github.com/BerriAI/litellm/issues/2644) | 2024-03-22 19:57:01 -07:00
Krrish Dholakia | 697758e188 | test(test_streaming.py): fix test | 2024-03-18 22:06:29 -07:00
ishaan-jaff | 960d7ae2a4 | (fix-ci-cd) skip deep infra 429 errors | 2024-03-13 22:05:16 -07:00
Krrish Dholakia | 0bdaf3e05b | test: set verbose for test | 2024-03-12 12:56:38 -07:00
Krrish Dholakia | 1c6438c267 | fix(anthropic.py): support streaming with function calling | 2024-03-12 09:52:11 -07:00
Krrish Dholakia | d8cf889597 | fix(bedrock.py): enable claude-3 streaming | 2024-03-09 14:02:27 -08:00
ishaan-jaff | f6e53f542c | (ci/cd) run again | 2024-03-08 22:05:39 -08:00
Krrish Dholakia | 3dd680a7cd | test(test_streaming.py): skip flaky test | 2024-03-04 22:33:45 -08:00
Krrish Dholakia | 0c00b57a5c | fix(test_streaming.py): skip flaky test | 2024-03-04 20:46:21 -08:00
Krrish Dholakia | dad65ca602 | fix(bedrock.py): support anthropic messages api on bedrock (claude-3) | 2024-03-04 17:15:47 -08:00
ishaan-jaff | 351c648dc0 | (fix) test streaming | 2024-03-04 08:29:16 -08:00
Krish Dholakia | 7d2a9f9222 | Merge pull request #2236 from BerriAI/litellm_faster_usage_logs_pg_db (fix(proxy_server.py): introduces a beta endpoint for admin to view global spend) | 2024-02-28 17:08:55 -08:00
Krrish Dholakia | 129e84f0c5 | test: removing aws tests - account suspended - pending their approval | 2024-02-28 13:46:20 -08:00
ishaan-jaff | 6b253fec7c | (test) hugging face test - their endpoint is unstable | 2024-02-28 12:11:29 -08:00
Krrish Dholakia | 032eb2f143 | test(test_streaming.py): add more logging | 2024-02-26 21:44:28 -08:00
Krrish Dholakia | 4e608c86c1 | fix(utils.py): fix streaming | 2024-02-26 12:52:53 -08:00
Krrish Dholakia | dcca55159b | fix(utils.py): support returning caching streaming response for function calling streaming calls | 2024-02-26 12:31:00 -08:00
Krrish Dholakia | 74b97b23e9 | fix(utils.py): stricter azure function calling tests | 2024-02-23 20:55:32 -08:00
Krish Dholakia | 858c21c96d | Merge branch 'main' into litellm_fix_azure_function_calling_streaming | 2024-02-22 22:36:38 -08:00
Krrish Dholakia | 38f7873d6f | test(test_streaming.py): fix openai streaming function calling test | 2024-02-22 21:50:14 -08:00
Krrish Dholakia | 5775526453 | test(test_streaming.py): add exception mapping for palm timeout error | 2024-02-22 21:36:57 -08:00
Krrish Dholakia | 3b156888a6 | fix(utils.py): ensure argument is always a string | 2024-02-22 15:15:56 -08:00
Krrish Dholakia | 75d30cd000 | test: handle api errors for gemini/palm testing | 2024-02-21 21:44:08 -08:00
Krrish Dholakia | 540c9e7bae | test(test_streaming.py): handle gemini 500 error | 2024-02-21 21:32:03 -08:00
Krrish Dholakia | 13b35db8d7 | fix(gemini.py): implement custom streamer | 2024-02-20 17:10:51 -08:00
Krrish Dholakia | 11c12e7381 | fix(gemini.py): fix async streaming + add native async completions | 2024-02-19 22:41:36 -08:00
Krrish Dholakia | 9c2cce56f7 | fix(huggingface_restapi.py): return streamed response correctly | 2024-02-16 13:25:13 -08:00
Krrish Dholakia | f82a335118 | fix(test_streaming.py): handle hf tgi zephyr not loading for streaming issue | 2024-02-15 19:24:02 -08:00
Krrish Dholakia | 5de569fcb1 | feat(sagemaker.py): aioboto3 streaming support | 2024-02-12 21:18:34 -08:00
Krrish Dholakia | 0ab8b764cc | fix(utils.py): fix deepinfra streaming | 2024-02-01 18:46:50 -08:00
Krrish Dholakia | bdc749baa0 | test(test_streaming.py): add test to make sure role is not returned in middle chunks during streaming | 2024-02-01 09:55:56 -08:00
Krrish Dholakia | 2a9d8c1b70 | test(test_streaming.py): fix test to handle none chunk | 2024-01-25 18:34:13 -08:00
Krrish Dholakia | 402235dc5d | fix(utils.py): fix sagemaker async logging for sync streaming (https://github.com/BerriAI/litellm/issues/1592) | 2024-01-25 12:49:45 -08:00
ishaan-jaff | d966e2dadd | (test) test chunk_ids match across chunks for bedrock | 2024-01-23 13:06:05 -08:00
ishaan-jaff | e2e56c03f4 | (test) fix sagemaker stream test | 2024-01-23 10:07:13 -08:00
Krrish Dholakia | 2ecc2f12cd | fix(gemini.py): support streaming | 2024-01-19 20:21:34 -08:00
ishaan-jaff | af5cbf75f8 | (ci/cd) skip streaming + hosted ollama flaky test | 2024-01-16 15:01:56 -08:00
ishaan-jaff | 7226fdff57 | (ci/cd) fix ollama hosted testing | 2024-01-16 12:27:16 -08:00
ishaan-jaff | 99474dbf33 | (test) custom_llm_provider in streaming response | 2024-01-12 17:15:15 -08:00
Krrish Dholakia | 49e9604de1 | test: testing fixes | 2024-01-09 10:23:34 +05:30
Krrish Dholakia | e4a5a3395c | fix(huggingface_restapi.py): support timeouts for huggingface + openai text completions (https://github.com/BerriAI/litellm/issues/1334) | 2024-01-08 11:40:56 +05:30
ishaan-jaff | 89c80f2a60 | (ci/cd) retry hosted ollama + stream test 3 times | 2024-01-05 18:02:20 +05:30
Krrish Dholakia | fd5e6efb1d | fix(azure.py, openai.py): correctly raise errors if streaming calls fail | 2023-12-27 15:08:37 +05:30
Krrish Dholakia | 79978c44ba | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
Krrish Dholakia | 72e8c84914 | build(test_streaming.py): fix linting issues | 2023-12-25 07:34:54 +05:30
Krrish Dholakia | 6d73a77b01 | fix(proxy_server.py): raise streaming exceptions | 2023-12-25 07:18:09 +05:30
Krrish Dholakia | a65dfdde94 | test(test_completion.py + test_streaming.py): add ollama endpoint to ci/cd pipeline | 2023-12-22 12:21:33 +05:30
ishaan-jaff | 4fb2dd7cae | (test) mistral api streaming | 2023-12-14 18:28:03 +05:30