Author | Commit | Message | Date
Krrish Dholakia | 89273722ba | fix(bedrock_httpx.py): handle empty arguments returned during tool calling streaming | 2024-08-07 09:54:50 -07:00
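The fix above concerns providers that emit an empty `arguments` string before any tool-call tokens arrive, which breaks a naive `json.loads`. A minimal sketch of the pattern (hypothetical helper name; the actual fix lives in litellm's `bedrock_httpx.py`):

```python
import json

def parse_tool_call_arguments(raw_arguments: str) -> dict:
    """Parse a streamed tool-call arguments payload, treating an empty
    string as 'no arguments' instead of raising json.JSONDecodeError.

    Hypothetical sketch, not litellm's actual implementation.
    """
    if not raw_arguments or not raw_arguments.strip():
        # some providers stream "" before any argument tokens arrive
        return {}
    return json.loads(raw_arguments)
```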
Krrish Dholakia | ec4051592b | fix(anthropic.py): handle scenario where anthropic returns invalid json string for tool call while streaming (Fixes https://github.com/BerriAI/litellm/issues/5063) | 2024-08-07 09:24:11 -07:00
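During streaming, tool-call arguments arrive as JSON fragments that are individually invalid until the object is complete. One common way to handle this, sketched here under the assumption that the fix buffers fragments until they parse (hypothetical class, not litellm's code):

```python
import json

class StreamingArgumentBuffer:
    """Accumulate tool-call argument fragments that may individually be
    invalid JSON; return a parsed dict only once the buffer parses cleanly.
    """

    def __init__(self):
        self._buf = ""

    def add(self, fragment: str):
        self._buf += fragment
        try:
            return json.loads(self._buf)
        except json.JSONDecodeError:
            return None  # still a partial fragment; keep buffering
```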
Krrish Dholakia | a502914f13 | fix(utils.py): fix anthropic streaming usage calculation (Fixes https://github.com/BerriAI/litellm/issues/4965) | 2024-08-01 14:45:54 -07:00
Ishaan Jaff | 557881326f | testing fix - skip rate limit errors from anthropic api | 2024-08-01 12:12:50 -07:00
Ishaan Jaff | 90aea04d40 | use timeouts for predibase - never use them in prod! | 2024-07-30 22:54:01 -07:00
Ishaan Jaff | 1b15eb7be5 | fix predibase tests | 2024-07-30 22:49:33 -07:00
Krrish Dholakia | 644bc5049c | test: cleanup duplicate tests + add error handling for backend api errors | 2024-07-30 21:47:52 -07:00
Krrish Dholakia | 767308562a | test(test_streaming.py): fix streaming test | 2024-07-30 20:36:19 -07:00
Ishaan Jaff | a8ef88039a | handle predibase failing streaming tests | 2024-07-30 17:53:27 -07:00
Krrish Dholakia | 42d065c093 | test(test_streaming.py): move to mock implementation for sagemaker streaming tests | 2024-07-30 15:34:27 -07:00
Krrish Dholakia | 615c8f48e3 | test(test_streaming.py): handle predibase instability | 2024-07-30 15:32:36 -07:00
Krrish Dholakia | e7744177cb | fix(utils.py): don't raise error on openai content filter during streaming - return as is (fixes an issue where litellm raised an error, while OpenAI itself returns the chunk with finish_reason 'content_filter') | 2024-07-25 19:50:52 -07:00
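The content-filter fix above treats `'content_filter'` as a normal terminal finish reason rather than an error, matching OpenAI's own behavior. A minimal sketch of a consumer written that way (hypothetical helper, assuming OpenAI-style chunk dicts):

```python
# 'content_filter' is a normal terminal reason, not an error condition
TERMINAL_FINISH_REASONS = {"stop", "length", "tool_calls", "content_filter"}

def collect_stream(chunks):
    """Drain a stream of OpenAI-style chunk dicts, returning the
    accumulated text and the terminal finish reason. A content-filtered
    stream ends normally instead of raising. Hypothetical sketch."""
    text, finish_reason = [], None
    for chunk in chunks:
        choice = chunk["choices"][0]
        delta = choice.get("delta", {})
        if delta.get("content"):
            text.append(delta["content"])
        if choice.get("finish_reason") in TERMINAL_FINISH_REASONS:
            finish_reason = choice["finish_reason"]
            break
    return "".join(text), finish_reason
```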
Krrish Dholakia | 271407400a | fix(utils.py): support raw response headers for streaming requests | 2024-07-23 11:58:58 -07:00
Krrish Dholakia | a01a45230c | fix(openai.py): support completion, streaming, async_streaming | 2024-07-20 15:23:42 -07:00
Krrish Dholakia | eb4dcf67bc | fix(bedrock_httpx.py): support jamba streaming | 2024-07-18 19:36:50 -07:00
Ishaan Jaff | 988d2c7fdf | shorten predibase streaming test | 2024-07-17 17:50:42 -07:00
Krrish Dholakia | d85f24a80b | fix(utils.py): fix recreating model response object when stream usage is true | 2024-07-11 21:01:12 -07:00
Ishaan Jaff | a94be8b616 | fix test_bedrock_httpx_streaming | 2024-07-10 17:14:53 -07:00
Krish Dholakia | c643be0c0c | Merge branch 'main' into litellm_gemini_stream_tool_calling | 2024-07-06 19:07:31 -07:00
Ishaan Jaff | 0b24027da2 | test had "assert False" for some reason? | 2024-07-06 15:46:28 -07:00
Krrish Dholakia | 9f900a1bed | fix(vertex_httpx.py): support tool calling w/ streaming for vertex ai + gemini | 2024-07-06 14:02:25 -07:00
Krrish Dholakia | 00497b408d | fix(anthropic.py): fix anthropic tool calling + streaming (Fixes https://github.com/BerriAI/litellm/issues/4537) | 2024-07-04 16:30:24 -07:00
Krrish Dholakia | 84044c08b6 | fix(utils.py): stream_options working across all providers | 2024-07-03 20:40:46 -07:00
Krrish Dholakia | eae049d059 | fix(anthropic.py): support *real* anthropic tool calling + streaming (parses each chunk and translates to openai format) | 2024-07-03 19:48:35 -07:00
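The last commit above parses Anthropic streaming events and translates them to OpenAI format. A minimal sketch of that translation, covering only the text and tool-argument delta events from Anthropic's documented SSE event types (hypothetical function, not litellm's actual code):

```python
def anthropic_event_to_openai_delta(event):
    """Translate one Anthropic streaming event into an OpenAI-style delta
    dict, or None for event types this sketch does not cover."""
    if event.get("type") != "content_block_delta":
        return None
    delta = event["delta"]
    if delta.get("type") == "text_delta":
        return {"content": delta["text"]}
    if delta.get("type") == "input_json_delta":
        # tool-call arguments stream as partial JSON strings
        return {"tool_calls": [{
            "index": 0,
            "function": {"arguments": delta["partial_json"]},
        }]}
    return None
```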
Krrish Dholakia | 3fdfb8470d | fix(replicate.py): correctly handle invalid replicate response | 2024-07-03 10:09:46 -07:00
Ishaan Jaff | 1a0f2aff24 | fix test_completion_replicate_llama3_streaming | 2024-06-29 19:51:07 -07:00
Krrish Dholakia | 0029d437cc | test(test_streaming.py): try-except replicate api instability | 2024-06-28 22:19:44 -07:00
Krrish Dholakia | 97ad7eb535 | fix(cost_calculator.py): handle unexpected error in cost_calculator.py | 2024-06-28 14:53:00 -07:00
Krrish Dholakia | 66e3a4f30e | fix(support-'alt=sse'-param): Fixes https://github.com/BerriAI/litellm/issues/4459 | 2024-06-28 10:40:47 -07:00
Krrish Dholakia | 14fdbf26a6 | fix(vertex_httpx.py): flush remaining chunks from stream | 2024-06-22 20:33:54 -07:00
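"Flush remaining chunks" typically means draining whatever is left in a line buffer once the provider closes the stream, so a final record without a trailing newline is not silently dropped. A sketch of that pattern, assuming a newline-delimited byte stream (hypothetical generator, my reading of the commit rather than its actual code):

```python
def iter_complete_lines(byte_chunks):
    """Yield newline-delimited records from a byte stream, then flush
    whatever remains in the buffer when the stream ends."""
    buffer = b""
    for chunk in byte_chunks:
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            yield line
    if buffer:
        # flush the remainder once the provider closes the stream
        yield buffer
```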
Krrish Dholakia | 2d8135231f | fix(vertex_httpx.py): ignore vertex finish reason - wait for stream to end (Fixes https://github.com/BerriAI/litellm/issues/4339) | 2024-06-22 20:21:01 -07:00
Krrish Dholakia | a6716177fa | test: cleanup tests | 2024-06-17 16:33:58 -07:00
Krrish Dholakia | 58907917b6 | test(test_streaming.py): reduce vertex tests to avoid exhausting limits | 2024-06-13 07:30:53 -07:00
Krish Dholakia | 50c74fce49 | Merge branch 'main' into litellm_vertex_completion_httpx | 2024-06-12 21:19:22 -07:00
Ishaan Jaff | f70a3cc588 | ci/cd predibase is an unstable provider | 2024-06-12 21:10:30 -07:00
Krrish Dholakia | 1dac2aa59f | fix(vertex_httpx.py): support streaming via httpx client | 2024-06-12 19:55:14 -07:00
Ishaan Jaff | 2622f33bbd | ci/cd fix predibase 500 errors | 2024-06-11 23:15:48 -07:00
Krrish Dholakia | 5056fd5778 | fix(bedrock_httpx.py): returning correct finish reason on streaming completion | 2024-06-10 14:47:49 -07:00
Krrish Dholakia | 2d95eaa5bc | fix(bedrock_httpx.py): fix tool calling for anthropic bedrock calls w/ streaming (Fixes https://github.com/BerriAI/litellm/issues/4091) | 2024-06-10 14:20:25 -07:00
Krrish Dholakia | 12ed3dc911 | refactor(main.py): only route anthropic calls through converse api (v0 scope: move function calling to converse api) | 2024-06-07 08:47:51 -07:00
Krrish Dholakia | c41b60f6bf | feat(bedrock_httpx.py): working bedrock converse api streaming | 2024-06-06 22:13:21 -07:00
Krish Dholakia | e678dce88b | Merge pull request #4009 from BerriAI/litellm_fix_streaming_cost_cal (fix(utils.py): fix cost calculation for openai-compatible streaming object) | 2024-06-04 21:00:22 -07:00
Krrish Dholakia | 1336957077 | fix(utils.py): fix | 2024-06-04 19:41:20 -07:00
Krrish Dholakia | e279498970 | feat(utils.py): support 'stream_options' param across all providers (Closes https://github.com/BerriAI/litellm/issues/3553) | 2024-06-04 19:03:26 -07:00
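The `stream_options` work above emulates OpenAI's documented behavior for `stream_options={"include_usage": True}`: the usage block arrives on the final chunk of the stream, and earlier chunks carry `usage` of `None`. A minimal consumer sketch under that assumption (hypothetical helper, operating on plain dicts):

```python
def final_usage(chunks):
    """Return the usage block from a stream of chunk dicts, i.e. the
    last non-None 'usage' value, or None if the stream carried none."""
    usage = None
    for chunk in chunks:
        if chunk.get("usage") is not None:
            usage = chunk["usage"]
    return usage
```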
Krrish Dholakia | d74ccc6c84 | fix(utils.py): fix stream options to return consistent response object | 2024-06-04 18:17:45 -07:00
Krrish Dholakia | 7432c6a4d9 | fix(utils.py): fix cost calculation for openai-compatible streaming object | 2024-06-04 10:36:25 -07:00
Ishaan Jaff | e21408717c | test - test gpt-3.5-turbo on test_parallel_streaming_requests | 2024-06-01 20:56:33 -07:00
Krrish Dholakia | 594daef07a | fix(utils.py): correctly instrument passing through api version in optional param check | 2024-06-01 19:31:52 -07:00
Krrish Dholakia | ca82b82d4b | fix(test_scheduler.py): fix test | 2024-06-01 11:30:26 -07:00
Krrish Dholakia | 3896e3e88f | fix: fix streaming with httpx client (prevent overwriting streams in parallel streaming calls) | 2024-05-31 10:55:18 -07:00