Commit graph

141 commits

Author SHA1 Message Date
Krish Dholakia
7d2a9f9222 Merge pull request #2236 from BerriAI/litellm_faster_usage_logs_pg_db
    fix(proxy_server.py): introduces a beta endpoint for admin to view global spend 2024-02-28 17:08:55 -08:00
Krrish Dholakia
129e84f0c5 test: removing aws tests - account suspended - pending their approval 2024-02-28 13:46:20 -08:00
ishaan-jaff
6b253fec7c (test) hugging face test - their endpoint is unstable 2024-02-28 12:11:29 -08:00
Krrish Dholakia
032eb2f143 test(test_streaming.py): add more logging 2024-02-26 21:44:28 -08:00
Krrish Dholakia
4e608c86c1 fix(utils.py): fix streaming 2024-02-26 12:52:53 -08:00
Krrish Dholakia
dcca55159b fix(utils.py): support returning caching streaming response for function calling streaming calls 2024-02-26 12:31:00 -08:00
Krrish Dholakia
74b97b23e9 fix(utils.py): stricter azure function calling tests 2024-02-23 20:55:32 -08:00
Krish Dholakia
858c21c96d Merge branch 'main' into litellm_fix_azure_function_calling_streaming 2024-02-22 22:36:38 -08:00
Krrish Dholakia
38f7873d6f test(test_streaming.py): fix openai streaming function calling test 2024-02-22 21:50:14 -08:00
Krrish Dholakia
5775526453 test(test_streaming.py): add exception mapping for palm timeout error 2024-02-22 21:36:57 -08:00
Krrish Dholakia
3b156888a6 fix(utils.py): ensure argument is always a string 2024-02-22 15:15:56 -08:00
Krrish Dholakia
75d30cd000 test: handle api errors for gemini/palm testing 2024-02-21 21:44:08 -08:00
Krrish Dholakia
540c9e7bae test(test_streaming.py): handle gemini 500 error 2024-02-21 21:32:03 -08:00
Krrish Dholakia
13b35db8d7 fix(gemini.py): implement custom streamer 2024-02-20 17:10:51 -08:00
Krrish Dholakia
11c12e7381 fix(gemini.py): fix async streaming + add native async completions 2024-02-19 22:41:36 -08:00
Krrish Dholakia
9c2cce56f7 fix(huggingface_restapi.py): return streamed response correctly 2024-02-16 13:25:13 -08:00
Krrish Dholakia
f82a335118 fix(test_streaming.py): handle hf tgi zephyr not loading for streaming issue 2024-02-15 19:24:02 -08:00
Krrish Dholakia
5de569fcb1 feat(sagemaker.py): aioboto3 streaming support 2024-02-12 21:18:34 -08:00
Krrish Dholakia
0ab8b764cc fix(utils.py): fix deepinfra streaming 2024-02-01 18:46:50 -08:00
Krrish Dholakia
bdc749baa0 test(test_streaming.py): add test to make sure role is not returned in middle chunks during streaming 2024-02-01 09:55:56 -08:00
Krrish Dholakia
2a9d8c1b70 test(test_streaming.py): fix test to handle none chunk 2024-01-25 18:34:13 -08:00
Krrish Dholakia
402235dc5d fix(utils.py): fix sagemaker async logging for sync streaming
    https://github.com/BerriAI/litellm/issues/1592 2024-01-25 12:49:45 -08:00
ishaan-jaff
d966e2dadd (test) test chunk_ids match across chunks for bedrock 2024-01-23 13:06:05 -08:00
ishaan-jaff
e2e56c03f4 (test) fix sagemaker stream test 2024-01-23 10:07:13 -08:00
Krrish Dholakia
2ecc2f12cd fix(gemini.py): support streaming 2024-01-19 20:21:34 -08:00
ishaan-jaff
af5cbf75f8 (ci/cd) skip streaming + hosted ollama flaky test 2024-01-16 15:01:56 -08:00
ishaan-jaff
7226fdff57 (ci/cd) fix olama hosted testing 2024-01-16 12:27:16 -08:00
ishaan-jaff
99474dbf33 (test) custom_llm_provider in streaming response 2024-01-12 17:15:15 -08:00
Krrish Dholakia
49e9604de1 test: testing fixes 2024-01-09 10:23:34 +05:30
Krrish Dholakia
e4a5a3395c fix(huggingface_restapi.py): support timeouts for huggingface + openai text completions
    https://github.com/BerriAI/litellm/issues/1334 2024-01-08 11:40:56 +05:30
ishaan-jaff
89c80f2a60 (ci/cd) retry hosted ollama + stream test 3 times 2024-01-05 18:02:20 +05:30
Krrish Dholakia
fd5e6efb1d fix(azure.py,-openai.py): correctly raise errors if streaming calls fail 2023-12-27 15:08:37 +05:30
Krrish Dholakia
79978c44ba refactor: add black formatting 2023-12-25 14:11:20 +05:30
Krrish Dholakia
72e8c84914 build(test_streaming.py): fix linting issues 2023-12-25 07:34:54 +05:30
Krrish Dholakia
6d73a77b01 fix(proxy_server.py): raise streaming exceptions 2023-12-25 07:18:09 +05:30
Krrish Dholakia
a65dfdde94 test(test_completion.py-+-test_streaming.py): add ollama endpoint to ci/cd pipeline 2023-12-22 12:21:33 +05:30
ishaan-jaff
4fb2dd7cae (test) mistral api streaming 2023-12-14 18:28:03 +05:30
Krrish Dholakia
c0d42ab5b1 test: refactor testing 2023-12-12 16:21:41 -08:00
Krrish Dholakia
d0e01d7e7a fix(utils.py): flush holding chunk for streaming, on stream end 2023-12-12 16:13:31 -08:00
Krrish Dholakia
243b914965 test(test_streaming.py): stricter output format testing 2023-12-12 00:16:48 -08:00
Krrish Dholakia
2c49bd4c60 test(test_streaming.py): add testing for azure output chunk 2023-12-11 23:32:38 -08:00
Krrish Dholakia
a94b6b5cd5 test(test_streaming.py): refactor testing 2023-11-29 19:58:04 -08:00
Krrish Dholakia
a1ea893a73 fix(main.py): don't pass stream to petals 2023-11-29 19:58:04 -08:00
Krrish Dholakia
2a5592abe7 fix(bedrock.py): support ai21 / bedrock streaming 2023-11-29 16:35:06 -08:00
Krrish Dholakia
ff865ab2c5 fix(azure.py): fix error handling for openai/azure streaming 2023-11-29 11:52:24 -08:00
Krrish Dholakia
5450fb26f4 fix(utils.py): azure tool calling streaming 2023-11-27 19:07:38 -08:00
Krrish Dholakia
0b2c1496d2 fix: fix nlp cloud streaming 2023-11-25 13:45:23 -08:00
Krrish Dholakia
20ca1b32eb bump: version 1.7.0 → 1.7.1 2023-11-25 12:34:28 -08:00
Krrish Dholakia
90385ef6a5 fix(utils.py): remove eos token for zephyr models 2023-11-23 17:47:39 -08:00
ishaan-jaff
7e6c37962b (test) update testing streaming for open ai v1.0.0+ 2023-11-18 19:04:41 -08:00