Commit graph

1311 commits

Author SHA1 Message Date
Krrish Dholakia
9e1e97528d fix(utils.py): ensure last chunk is always empty delta w/ finish reason
makes sure we're openai-compatible with our streaming. Adds stricter tests for this as well
2024-03-25 16:33:41 -07:00
Krrish Dholakia
f153889738 fix(utils.py): allow user to disable streaming logging
fixes event loop issue for litellm.disable_streaming_logging
2024-03-25 14:28:46 -07:00
Max Deichmann
efb43ccd02 push 2024-03-25 17:43:55 +01:00
Krrish Dholakia
eb3ca85d7e feat(router.py): enable pre-call checks
filter models outside of context window limits of a given message for a model group

 https://github.com/BerriAI/litellm/issues/872
2024-03-23 18:03:30 -07:00
Tasha Upchurch
ab919004a2
Update utils.py
fix for constructed from dict choices.message being a dict still instead of Message class.
2024-03-23 00:12:24 -04:00
Ishaan Jaff
f39f606e02 (feat) remove litellm.telemetry 2024-03-22 20:58:14 -07:00
Tasha Upchurch
79201449d2
Update utils.py
Fix for creating an empty choices if no choices passed in
2024-03-22 23:39:17 -04:00
Tasha Upchurch
9201c122e7
Update utils.py
fix for #2655
2024-03-22 23:13:24 -04:00
Krrish Dholakia
691a83b7dc fix(anthropic.py): handle multiple system prompts 2024-03-22 18:14:15 -07:00
Vincelwt
29e8c144fb
Merge branch 'main' into main 2024-03-22 00:52:42 +09:00
Ishaan Jaff
0558ac2345 (fix) don't run .completion retries if using router / proxy 2024-03-21 08:32:42 -07:00
Krrish Dholakia
524c244dd9 fix(utils.py): support response_format param for ollama
https://github.com/BerriAI/litellm/issues/2580
2024-03-19 21:07:20 -07:00
Ishaan Jaff
c196186190 (fix) add /metrics to utils.py 2024-03-19 17:28:33 -07:00
Vincelwt
1cbfd312fe
Merge branch 'main' into main 2024-03-19 12:50:04 +09:00
Krish Dholakia
c669943292
Merge pull request #2577 from BerriAI/litellm_vertex_ai_streaming_func_call
feat(vertex_ai.py): support gemini (vertex ai) function calling when streaming
2024-03-18 20:10:00 -07:00
Ishaan Jaff
ac826851fa (feat) v0 datadog logger 2024-03-18 16:01:47 -07:00
Krrish Dholakia
f49dc9a99f feat(vertex_ai.py): support gemini (vertex ai) function calling when streaming 2024-03-18 11:47:27 -07:00
Krrish Dholakia
bc66ef9d5c fix(utils.py): fix aws secret manager + support key_management_settings
fixes the aws secret manager implementation and allows the user to set which keys they want to check thr
ough it
2024-03-16 16:47:50 -07:00
Krrish Dholakia
d8956e9255 fix(utils.py): initial commit for aws secret manager support 2024-03-16 14:37:46 -07:00
Krrish Dholakia
909341c4f2 fix(utils.py): async add to cache - for streaming 2024-03-15 18:25:40 -07:00
Krrish Dholakia
226953e1d8 feat(batch_redis_get.py): batch redis GET requests for a given key + call type
reduces the number of GET requests we're making in high-throughput scenarios
2024-03-15 14:40:11 -07:00
Krrish Dholakia
9909f44015 feat(utils.py): add native fireworks ai support
addresses - https://github.com/BerriAI/litellm/issues/777, https://github.com/BerriAI/litellm/issues/2486
2024-03-15 09:09:59 -07:00
Krrish Dholakia
a634424fb2 fix(utils.py): move to using litellm.modify_params to enable max output token trimming fix 2024-03-14 12:17:56 -07:00
Krrish Dholakia
234cdbbfef feat(prompt_injection_detection.py): support simple heuristic similarity check for prompt injection attacks 2024-03-13 10:32:21 -07:00
Krish Dholakia
9f2d540ebf
Merge pull request #2472 from BerriAI/litellm_anthropic_streaming_tool_calling
fix(anthropic.py): support claude-3 streaming with function calling
2024-03-12 21:36:01 -07:00
Ishaan Jaff
7b4f9691c7
Merge pull request #2479 from BerriAI/litellm_cohere_tool_call
[FEAT Cohere/command-r tool calling
2024-03-12 21:20:59 -07:00
Krrish Dholakia
d620b4dc5d fix(anthropic.py): bug fix 2024-03-12 19:32:42 -07:00
ishaan-jaff
b9bfc7c36c (fix) use cohere_chat optional params 2024-03-12 14:31:43 -07:00
Krish Dholakia
0d18f3c0ca
Merge pull request #2473 from BerriAI/litellm_fix_compatible_provider_model_name
fix(openai.py): return model name with custom llm provider for openai-compatible endpoints (e.g. mistral, together ai, etc.)
2024-03-12 12:58:29 -07:00
Krish Dholakia
1ba102c618
Merge pull request #2475 from BerriAI/litellm_azure_dall_e_3_cost_tracking
fix(azure.py): support cost tracking for azure/dall-e-3
2024-03-12 12:57:31 -07:00
ishaan-jaff
a18c941621 (fix) failing cohere test 2024-03-12 12:44:19 -07:00
ishaan-jaff
d136238f6f (v0) tool calling 2024-03-12 12:35:52 -07:00
Krrish Dholakia
d07c813ef9 test: add more logging for failing test 2024-03-12 11:15:14 -07:00
Ishaan Jaff
5172fb1de9
Merge pull request #2474 from BerriAI/litellm_support_command_r
[New-Model] Cohere/command-r
2024-03-12 11:11:56 -07:00
Krrish Dholakia
7dd94c802e fix(azure.py): support cost tracking for azure/dall-e-3 2024-03-12 10:55:54 -07:00
ishaan-jaff
e5bb65669d (feat) exception mapping for cohere_chat 2024-03-12 10:45:42 -07:00
Krrish Dholakia
0033613b9e fix(openai.py): return model name with custom llm provider for openai compatible endpoints 2024-03-12 10:30:10 -07:00
Krrish Dholakia
86ed0aaba8 fix(anthropic.py): support streaming with function calling 2024-03-12 09:52:11 -07:00
ishaan-jaff
223ac464d7 (fix) support streaming for azure/instruct models 2024-03-12 09:50:43 -07:00
Krrish Dholakia
312a9d8c26 fix(utils.py): support response_format for mistral ai api 2024-03-11 10:23:41 -07:00
Vince Loewe
7c38f992dc
Merge branch 'main' into main 2024-03-11 12:36:41 +09:00
Krish Dholakia
c7d0af0a2e
Merge pull request #2426 from BerriAI/litellm_whisper_cost_tracking
feat: add cost tracking + caching for `/audio/transcription` calls
2024-03-09 19:12:06 -08:00
Krrish Dholakia
1d15dde6de fix(utils.py): fix model setting in completion cost 2024-03-09 19:11:37 -08:00
Krrish Dholakia
8d2d51b625 fix(utils.py): fix model name checking 2024-03-09 18:22:26 -08:00
Krrish Dholakia
fa45c569fd feat: add cost tracking + caching for transcription calls 2024-03-09 15:43:38 -08:00
Krrish Dholakia
8b24ddcbbd fix(bedrock.py): enable claude-3 streaming 2024-03-09 14:02:27 -08:00
Krish Dholakia
caa99f43bf
Merge branch 'main' into litellm_load_balancing_transcription_endpoints 2024-03-08 23:08:47 -08:00
Krish Dholakia
e245b1c98a
Merge pull request #2401 from BerriAI/litellm_transcription_endpoints
feat(main.py): support openai transcription endpoints
2024-03-08 23:07:48 -08:00
Krrish Dholakia
fd52b502a6 fix(utils.py): *new* get_supported_openai_params() function
Returns the supported openai params for a given model + provider
2024-03-08 23:06:40 -08:00
Krrish Dholakia
aeb3cbc9b6 fix(utils.py): add additional providers to get_supported_openai_params 2024-03-08 23:06:40 -08:00