Krrish Dholakia | 3a1b3227d8 | fix(utils.py): ignore none chunk in stream infinite loop check (Fixes https://github.com/BerriAI/litellm/issues/5158#issuecomment-2287156946) | 2024-08-13 15:06:44 -07:00
Krrish Dholakia | fdd9a07051 | fix(utils.py): Break out of infinite streaming loop (Fixes https://github.com/BerriAI/litellm/issues/5158) | 2024-08-12 14:00:43 -07:00
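The two fixes above guard against a provider stream that keeps yielding empty or None chunks forever. A minimal standalone sketch of the idea (hypothetical helper and threshold, not litellm's actual utils.py implementation): None chunks are skipped without tripping the guard, while a long run of empty chunks breaks the loop.

```python
# Sketch of an infinite-stream guard, per the two commits above.
# Hypothetical helper; litellm's real logic lives in utils.py.

MAX_REPEATED_EMPTY_CHUNKS = 100  # assumed threshold, not litellm's value

def consume_stream(chunks):
    """Yield content chunks, breaking out if the stream loops forever on empties."""
    empty_count = 0
    for chunk in chunks:
        if chunk is None:
            # a None chunk is ignored and does not count toward the guard
            continue
        if chunk == "":
            empty_count += 1
            if empty_count > MAX_REPEATED_EMPTY_CHUNKS:
                raise RuntimeError("infinite streaming loop detected")
            continue
        empty_count = 0  # real content resets the counter
        yield chunk
```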
Krrish Dholakia | 3cf9148a4a | test: add vertex claude to streaming valid json str test | 2024-08-07 10:18:17 -07:00
Krrish Dholakia | 2ccb5a48b7 | fix(bedrock_httpx.py): handle empty arguments returned during tool calling streaming | 2024-08-07 09:54:50 -07:00
Krrish Dholakia | 4919cc4d25 | fix(anthropic.py): handle scenario where anthropic returns invalid json string for tool call while streaming (Fixes https://github.com/BerriAI/litellm/issues/5063) | 2024-08-07 09:24:11 -07:00
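The empty-arguments and invalid-JSON fixes above address the same failure mode: tool-call arguments arrive as string fragments during streaming, and the joined string may be empty or not yet valid JSON. A tolerant parse might look like this (hypothetical helper, not the actual bedrock_httpx.py/anthropic.py code):

```python
import json

# Sketch of tolerant handling for streamed tool-call arguments, per the
# fixes above. Hypothetical helper; real handling is in litellm's provider code.

def parse_tool_arguments(fragments):
    """Join streamed argument fragments; tolerate empty or partial JSON."""
    raw = "".join(f for f in fragments if f)
    if raw == "":
        return {}  # some providers stream empty arguments for no-arg tools
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # invalid/incomplete JSON: hand back the raw string so the caller
        # can keep accumulating instead of crashing mid-stream
        return raw
```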
Krrish Dholakia | ca0a0bed46 | fix(utils.py): fix anthropic streaming usage calculation (Fixes https://github.com/BerriAI/litellm/issues/4965) | 2024-08-01 14:45:54 -07:00
Ishaan Jaff | dce8e3efb8 | testing fix - skip rate limit errors from anthropic api | 2024-08-01 12:12:50 -07:00
Ishaan Jaff | 54e1f18832 | use timeouts for predibase - never use them in prod ! | 2024-07-30 22:54:01 -07:00
Ishaan Jaff | 19ab0614c4 | fix predibase tests | 2024-07-30 22:49:33 -07:00
Krrish Dholakia | 24395492aa | test: cleanup duplicate tests + add error handling for backend api errors | 2024-07-30 21:47:52 -07:00
Krrish Dholakia | eecd93c81d | test(test_streaming.py): fix streaming test | 2024-07-30 20:36:19 -07:00
Ishaan Jaff | 6c0506a144 | handle predibase failing streaming tests | 2024-07-30 17:53:27 -07:00
Krrish Dholakia | 84513c0254 | test(test_streaming.py): move to mock implementation for sagemaker streaming tests | 2024-07-30 15:34:27 -07:00
Krrish Dholakia | e1cbb397fe | test(test_streaming.py): handle predibase instability | 2024-07-30 15:32:36 -07:00
Krrish Dholakia | a2fd8459fc | fix(utils.py): don't raise error on openai content filter during streaming - return as is (Fixes issue where we would raise an error vs. openai who return the chunk with finish reason as 'content_filter') | 2024-07-25 19:50:52 -07:00
Krrish Dholakia | f64a3309d1 | fix(utils.py): support raw response headers for streaming requests | 2024-07-23 11:58:58 -07:00
Krrish Dholakia | a27454b8e3 | fix(openai.py): support completion, streaming, async_streaming | 2024-07-20 15:23:42 -07:00
Krrish Dholakia | 96471c145e | fix(bedrock_httpx.py): support jamba streaming | 2024-07-18 19:36:50 -07:00
Ishaan Jaff | 3110abc832 | shorten predibase streaming test | 2024-07-17 17:50:42 -07:00
Krrish Dholakia | b2e46086dd | fix(utils.py): fix recreating model response object when stream usage is true | 2024-07-11 21:01:12 -07:00
Ishaan Jaff | 7efe9beac5 | fix test_bedrock_httpx_streaming | 2024-07-10 17:14:53 -07:00
Krish Dholakia | 65bee737c5 | Merge branch 'main' into litellm_gemini_stream_tool_calling | 2024-07-06 19:07:31 -07:00
Ishaan Jaff | 482913c2e9 | test had "assert False" for some reason ? | 2024-07-06 15:46:28 -07:00
Krrish Dholakia | faa88a1ab1 | fix(vertex_httpx.py): support tool calling w/ streaming for vertex ai + gemini | 2024-07-06 14:02:25 -07:00
Krrish Dholakia | f2dabc65be | fix(anthropic.py): fix anthropic tool calling + streaming (Fixes https://github.com/BerriAI/litellm/issues/4537) | 2024-07-04 16:30:24 -07:00
Krrish Dholakia | 2e5a81f280 | fix(utils.py): stream_options working across all providers | 2024-07-03 20:40:46 -07:00
Krrish Dholakia | 344268e053 | fix(anthropic.py): support *real* anthropic tool calling + streaming (Parses each chunk and translates to openai format) | 2024-07-03 19:48:35 -07:00
Krrish Dholakia | d18dc2648c | fix(replicate.py): correctly handly invalid replicate response | 2024-07-03 10:09:46 -07:00
Ishaan Jaff | 03dbc29c85 | fix test_completion_replicate_llama3_streaming | 2024-06-29 19:51:07 -07:00
Krrish Dholakia | 831745e710 | test(test_streaming.py): try-except replicate api instability | 2024-06-28 22:19:44 -07:00
Krrish Dholakia | 0001683036 | fix(cost_calculator.py): handle unexpected error in cost_calculator.py | 2024-06-28 14:53:00 -07:00
Krrish Dholakia | a7122f91a1 | fix(support-'alt=sse'-param): Fixes https://github.com/BerriAI/litellm/issues/4459 | 2024-06-28 10:40:47 -07:00
Krrish Dholakia | 0fd9033502 | fix(vertex_httpx.py): flush remaining chunks from stream | 2024-06-22 20:33:54 -07:00
Krrish Dholakia | 73254987da | fix(vertex_httpx.py): ignore vertex finish reason - wait for stream to end (Fixes https://github.com/BerriAI/litellm/issues/4339) | 2024-06-22 20:21:01 -07:00
Krrish Dholakia | 1b215d704d | test: cleanup tests | 2024-06-17 16:33:58 -07:00
Krrish Dholakia | c5df662e0f | test(test_streaming.py): reduce vertex tests to avoid exhausting limits | 2024-06-13 07:30:53 -07:00
Krish Dholakia | 05e21441a6 | Merge branch 'main' into litellm_vertex_completion_httpx | 2024-06-12 21:19:22 -07:00
Ishaan Jaff | 29e5e3c265 | ci/cd predibase is an unstable provider | 2024-06-12 21:10:30 -07:00
Krrish Dholakia | 3955b058ed | fix(vertex_httpx.py): support streaming via httpx client | 2024-06-12 19:55:14 -07:00
Ishaan Jaff | 5ebeb4d067 | ci/cd fix predibase 500 errors | 2024-06-11 23:15:48 -07:00
Krrish Dholakia | 65baa83928 | fix(bedrock_httpx.py): returning correct finish reason on streaming completion | 2024-06-10 14:47:49 -07:00
Krrish Dholakia | 84652dd946 | fix(bedrock_httpx.py): fix tool calling for anthropic bedrock calls w/ streaming (Fixes https://github.com/BerriAI/litellm/issues/4091) | 2024-06-10 14:20:25 -07:00
Krrish Dholakia | 35e4323095 | refactor(main.py): only route anthropic calls through converse api (v0 scope let's move function calling to converse api) | 2024-06-07 08:47:51 -07:00
Krrish Dholakia | 51ba5652a0 | feat(bedrock_httpx.py): working bedrock converse api streaming | 2024-06-06 22:13:21 -07:00
Krish Dholakia | c544ba3654 | Merge pull request #4009 from BerriAI/litellm_fix_streaming_cost_cal (fix(utils.py): fix cost calculation for openai-compatible streaming object) | 2024-06-04 21:00:22 -07:00
Krrish Dholakia | 43af5575c8 | fix(utils.py): fix | 2024-06-04 19:41:20 -07:00
Krrish Dholakia | 54dacfdf61 | feat(utils.py): support 'stream_options' param across all providers (Closes https://github.com/BerriAI/litellm/issues/3553) | 2024-06-04 19:03:26 -07:00
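The `stream_options` work above follows OpenAI's streaming parameter: passing `stream_options={"include_usage": True}` makes the stream end with one extra chunk whose `usage` field reports token counts, while all earlier chunks carry `usage=None`. A sketch of consuming such a stream, using simplified mock chunk dicts rather than a live litellm call:

```python
# Mock consumption of a stream sent with stream_options={"include_usage": True},
# in the OpenAI chunk format the commits above standardize across providers.
# Chunk shapes are simplified dicts, not litellm's actual response objects.

def collect_stream(chunks):
    """Concatenate streamed content and pull usage off the final chunk."""
    text, usage = [], None
    for chunk in chunks:
        if chunk.get("usage") is not None:
            usage = chunk["usage"]  # usage-only chunk carries no content
            continue
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            text.append(delta)
    return "".join(text), usage

mock_chunks = [
    {"choices": [{"delta": {"content": "Hello"}}], "usage": None},
    {"choices": [{"delta": {"content": " world"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
]
```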
Krrish Dholakia | 9aa29854de | fix(utils.py): fix stream options to return consistent response object | 2024-06-04 18:17:45 -07:00
Krrish Dholakia | 52a2f5150c | fix(utils.py): fix cost calculation for openai-compatible streaming object | 2024-06-04 10:36:25 -07:00
Ishaan Jaff | 2f9dac6238 | test - test gpt-3.5-turbo on test_parallel_streaming_requests | 2024-06-01 20:56:33 -07:00