Commit graph

1539 commits

Author SHA1 Message Date
Krish Dholakia
f6a262122b
Merge pull request #4054 from BerriAI/litellm_aws_kms_support
feat(aws_secret_manager.py): Support AWS KMS for Master Key encryption
2024-06-07 08:49:25 -07:00
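For illustration, a minimal sketch of what KMS-backed master-key handling can look like, assuming the encrypted key is stored base64-encoded in an env var; the variable name and the base64 step are assumptions, not LiteLLM's actual implementation.

```python
# Hypothetical sketch: decrypt a KMS-encrypted proxy master key at startup.
# The env var name (LITELLM_MASTER_KEY_ENCRYPTED) and base64 step are assumptions.
import base64
import os

import boto3


def load_master_key() -> str:
    kms = boto3.client("kms", region_name=os.environ.get("AWS_REGION", "us-west-2"))
    ciphertext = base64.b64decode(os.environ["LITELLM_MASTER_KEY_ENCRYPTED"])
    response = kms.decrypt(CiphertextBlob=ciphertext)
    return response["Plaintext"].decode("utf-8")
```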
Krrish Dholakia
35e4323095 refactor(main.py): only route anthropic calls through converse api
v0 scope: let's move function calling to the converse api
2024-06-07 08:47:51 -07:00
Krish Dholakia
471be6670c
Merge pull request #4049 from BerriAI/litellm_cleanup_traceback
refactor: replace 'traceback.print_exc()' with logging library
2024-06-07 08:03:22 -07:00
Krish Dholakia
b6e0bf27b8
Merge branch 'main' into litellm_aws_kms_support 2024-06-07 07:58:56 -07:00
Krrish Dholakia
51ba5652a0 feat(bedrock_httpx.py): working bedrock converse api streaming 2024-06-06 22:13:21 -07:00
Krrish Dholakia
a995a0b172 fix(bedrock_httpx.py): working claude 3 function calling 2024-06-06 20:12:41 -07:00
Krrish Dholakia
6e9bca59b0 fix(utils.py): fix exception mapping for azure internal server error 2024-06-06 17:12:30 -07:00
Krish Dholakia
1742141fb6
Merge pull request #4046 from BerriAI/litellm_router_order
feat(router.py): enable setting 'order' for a deployment in model list
2024-06-06 16:37:03 -07:00
Krish Dholakia
677e0255c8
Merge branch 'main' into litellm_cleanup_traceback 2024-06-06 16:32:08 -07:00
Krrish Dholakia
a2da2a8f16 feat(aws_secret_manager.py): allows user to keep a hash of the proxy master key in their env 2024-06-06 15:32:51 -07:00
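A minimal sketch of keeping only a hash of the master key in the environment and checking incoming keys against it; the env var name and the use of plain SHA-256 are assumptions for illustration, not the commit's exact scheme.

```python
# Hypothetical sketch: store only a hash of the master key, compare in constant time.
# LITELLM_MASTER_KEY_HASH and unsalted SHA-256 are illustrative assumptions.
import hashlib
import hmac
import os


def hash_key(key: str) -> str:
    return hashlib.sha256(key.encode("utf-8")).hexdigest()


def is_master_key(candidate: str) -> bool:
    stored_hash = os.environ["LITELLM_MASTER_KEY_HASH"]
    return hmac.compare_digest(hash_key(candidate), stored_hash)
```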
Krrish Dholakia
6cca5612d2 refactor: replace 'traceback.print_exc()' with logging library
allows error logs to be in json format for otel logging
2024-06-06 13:47:43 -07:00
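Roughly, the before/after pattern this refactor describes: swapping traceback.print_exc() for a logger call so errors flow through the logging pipeline (and can be formatted as JSON for OTEL). The logger name here is generic, not necessarily the one LiteLLM uses internally.

```python
import logging
import traceback

logger = logging.getLogger(__name__)  # LiteLLM uses its own logger; this name is generic


def risky_call():
    raise ValueError("boom")


try:
    risky_call()
except Exception:
    # Before: prints straight to stderr, bypassing handlers and JSON formatting
    traceback.print_exc()

try:
    risky_call()
except Exception:
    # After: routed through the logging library, so formatters (e.g. JSON for OTEL) apply
    logger.exception("risky_call failed")
```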
Ishaan Jaff
1e8429bb20 feat - redact messages from slack alerting 2024-06-06 10:38:15 -07:00
Raymond1415926
38b44c301a
Merge branch 'BerriAI:main' into main 2024-06-06 10:12:20 -07:00
Krrish Dholakia
a7dcf25722 feat(router.py): enable setting 'order' for a deployment in model list
Allows the user to control which model gets called first in a model group
2024-06-06 09:46:51 -07:00
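A hedged sketch of what giving deployments an explicit call order might look like; placing "order" inside each entry's litellm_params and the exact values are assumptions based on the commit message, not a confirmed schema.

```python
# Sketch (hedged): two deployments in one model group, with an explicit call order.
from litellm import Router

model_list = [
    {
        "model_name": "gpt-3.5-turbo",  # model group name
        "litellm_params": {"model": "gpt-3.5-turbo", "order": 1},  # tried first (assumed key placement)
    },
    {
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {"model": "azure/gpt-35-turbo", "order": 2},  # credentials omitted in this sketch
    },
]

router = Router(model_list=model_list)
response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
```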
Raymond Huang
f4c49755a0 fix token counter bug 2024-06-05 23:40:55 -07:00
Sha Ahammed Roze
ea67803747
Merge branch 'BerriAI:main' into main 2024-06-06 10:02:15 +05:30
Krrish Dholakia
a76a9b7d11 feat(bedrock_httpx.py): add support for bedrock converse api
closes https://github.com/BerriAI/litellm/issues/4000
2024-06-05 21:20:36 -07:00
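For context, the underlying AWS Bedrock Converse API that bedrock_httpx.py targets looks roughly like this via boto3; the model id and region are examples only.

```python
# The Bedrock Converse API shape the new integration talks to (example model/region).
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": [{"text": "Hello, Claude"}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)
print(response["output"]["message"]["content"][0]["text"])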
Ishaan Jaff
4d2337ec72
Merge branch 'main' into patch-1 2024-06-05 13:35:31 -07:00
Sha Ahammed Roze
0a4abfdd1d
Merge branch 'BerriAI:main' into main 2024-06-05 21:56:41 +05:30
Krrish Dholakia
b360ab4c89 fix(azure.py): support dynamic drop params 2024-06-05 09:03:10 -07:00
Krrish Dholakia
162f9400d2 feat(utils.py): support dynamically setting 'drop_params'
Allows the user to turn this on/off for individual calls by passing it in as a completion arg
2024-06-05 08:44:04 -07:00
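A short sketch of the two ways to toggle drop_params, globally and per call as added here; the specific provider/param combination shown is illustrative only.

```python
# Sketch: drop_params globally vs. per call.
import litellm

# Global toggle (applies to every call):
litellm.drop_params = True

# Per-call toggle, as added in this commit - params the target provider doesn't
# support are dropped instead of raising an error:
response = litellm.completion(
    model="command-r",                        # example provider/model
    messages=[{"role": "user", "content": "hi"}],
    response_format={"type": "json_object"},  # illustrative param that may be unsupported
    drop_params=True,
)
```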
sha-ahammed
faa4dfe03e feat: Add Ollama as a provider in the proxy UI 2024-06-05 16:48:38 +05:30
Ishaan Jaff
6b57352400 fix VertexAIException APIError 2024-06-04 22:11:11 -07:00
Ishaan Jaff
d6ba50319c fix langfuse log metadata 2024-06-04 21:34:49 -07:00
Krish Dholakia
c544ba3654
Merge pull request #4009 from BerriAI/litellm_fix_streaming_cost_cal
fix(utils.py): fix cost calculation for openai-compatible streaming object
2024-06-04 21:00:22 -07:00
Krish Dholakia
d6f4233441
Merge pull request #4015 from BerriAI/litellm_stream_options_fix_2
feat(utils.py): Support `stream_options` param across all providers
2024-06-04 20:59:39 -07:00
Krrish Dholakia
1a95660495 fix(test_completion.py): fix predibase test to be mock + fix optional param mapping for predibase 2024-06-04 20:06:23 -07:00
Krrish Dholakia
43af5575c8 fix(utils.py): fix 2024-06-04 19:41:20 -07:00
Krrish Dholakia
54dacfdf61 feat(utils.py): support 'stream_options' param across all providers
Closes https://github.com/BerriAI/litellm/issues/3553
2024-06-04 19:03:26 -07:00
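A minimal sketch of the stream_options behavior this adds: requesting usage stats on the final streamed chunk, OpenAI-style, regardless of provider.

```python
# Sketch: include_usage asks for token usage on the last streamed chunk.
import litellm

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
    stream_options={"include_usage": True},
)

for chunk in response:
    # Intermediate chunks carry content deltas; with include_usage,
    # the final chunk carries a usage object (token counts).
    print(chunk)
```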
Krrish Dholakia
34f31a1994 fix(utils.py): add coverage for text openai and databricks 2024-06-04 18:27:03 -07:00
Krrish Dholakia
9aa29854de fix(utils.py): fix stream options to return consistent response object 2024-06-04 18:17:45 -07:00
Ishaan Jaff
3b823c7587 fix - by default log raw curl command on langfuse 2024-06-04 16:30:25 -07:00
Krrish Dholakia
52a2f5150c fix(utils.py): fix cost calculation for openai-compatible streaming object 2024-06-04 10:36:25 -07:00
Krrish Dholakia
7b474ec267 fix(utils.py): add coverage for azure img gen content policy violation error 2024-06-04 08:29:30 -07:00
Krrish Dholakia
1de5235ba0 fix(router.py): use litellm.request_timeout as default for router clients 2024-06-03 14:19:53 -07:00
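A small sketch of the module-level timeout that router clients now fall back to when no per-deployment timeout is configured; the value is an example.

```python
# Sketch: router clients default to litellm.request_timeout when no timeout is set per deployment.
import litellm
from litellm import Router

litellm.request_timeout = 120  # seconds (example value)

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo"},  # no explicit timeout here
        }
    ]
)
```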
Ishaan Jaff
dd7d0a2895
Merge pull request #3983 from BerriAI/litellm_log_request_boddy_langfuse
[Feat] Log Raw Request from LiteLLM on Langfuse - when `"log_raw_request": true`
2024-06-03 13:42:06 -07:00
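A heavily hedged sketch of turning on raw-request logging for a single call; passing the flag through request metadata is an assumption based on the PR title, so check the Langfuse logging docs for where this setting actually lives.

```python
# Hedged sketch: enable raw-request logging for one call, with Langfuse as the success callback.
import litellm

litellm.success_callback = ["langfuse"]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
    metadata={"log_raw_request": True},  # assumed location of the flag
)
```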
Ishaan Jaff
857ceb40bc feat - log raw_request to langfuse / other logging providers 2024-06-03 07:53:52 -07:00
Krrish Dholakia
ea30359b38 fix(utils.py): handle else block for get optional params 2024-06-03 07:45:44 -07:00
Krrish Dholakia
9ef83126d7 fix(utils.py): correctly instrument passing through api version in optional param check 2024-06-01 19:31:52 -07:00
Krrish Dholakia
7efac4d36c fix(azure.py): support dropping 'tool_choice=required' for older azure API versions
Closes https://github.com/BerriAI/litellm/issues/3876
2024-06-01 18:44:50 -07:00
Krish Dholakia
e7ff3adc26
Merge pull request #3944 from BerriAI/litellm_fix_parallel_streaming
fix: fix streaming with httpx client
2024-05-31 21:42:37 -07:00
Krrish Dholakia
7523f803d2 fix(utils.py): support get_max_tokens() call with same model_name as completion
Closes https://github.com/BerriAI/litellm/issues/3921
2024-05-31 21:37:51 -07:00
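A quick sketch of the fixed behavior: get_max_tokens() accepting the same model name you would pass to completion().

```python
# Sketch: look up the token limit with the completion()-style model name.
import litellm

model = "gpt-3.5-turbo"
max_tokens = litellm.get_max_tokens(model)
print(model, max_tokens)

litellm.completion(
    model=model,
    messages=[{"role": "user", "content": "hi"}],
    max_tokens=min(256, max_tokens),
)
```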
Krrish Dholakia
93c3635b64 fix: fix streaming with httpx client
prevent overwriting streams in parallel streaming calls
2024-05-31 10:55:18 -07:00
lj
27ed72405b
Merge branch 'main' into fix-pydantic-warnings-again 2024-05-31 11:35:42 +08:00
Krish Dholakia
d3a247bf20
Merge pull request #3928 from BerriAI/litellm_audio_speech_endpoint
feat(main.py): support openai tts endpoint
2024-05-30 17:30:42 -07:00
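A hedged sketch of the text-to-speech support added here; the litellm-side function name and signature (litellm.speech) may differ across versions, and the model/voice values are examples.

```python
# Hedged sketch: OpenAI-style TTS through litellm; exact signature may vary by version.
import litellm

audio = litellm.speech(
    model="openai/tts-1",
    voice="alloy",
    input="Hello from the LiteLLM proxy",
)
audio.stream_to_file("speech.mp3")  # response wraps the raw audio bytes
```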
Krrish Dholakia
d65b7fe01b fix(main.py): add logging to audio_transcription calls 2024-05-30 16:57:11 -07:00
KX
d3921a3d28 fix: add missing seed parameter to ollama input
The current ollama interfacing does not allow passing a seed, which is supported per https://github.com/ollama/ollama/blob/main/docs/api.md#parameters and https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values

This resolves that by adding handling of the seed parameter.
2024-05-31 01:47:56 +08:00
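A brief sketch of what this fix enables: passing a seed through to ollama for more reproducible generations. The local model name is an example.

```python
# Sketch: seed is now forwarded to ollama's options.
import litellm

response = litellm.completion(
    model="ollama/llama3",  # example local model
    messages=[{"role": "user", "content": "Give me a random number"}],
    seed=42,
)
print(response.choices[0].message.content)
```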
Nir Gazit
b8d97c688c Revert "Revert "fix: Log errors in Traceloop Integration (reverts previous revert)"" 2024-05-30 04:06:45 +03:00
Krish Dholakia
77cc9cded9
Revert "fix: Log errors in Traceloop Integration (reverts previous revert)" 2024-05-29 16:30:09 -07:00
Krish Dholakia
c76deb8f76
Merge pull request #3846 from nirga/revert-3831-revert-3780-traceloop-failures
fix: Log errors in Traceloop Integration (reverts previous revert)
2024-05-29 08:54:01 -07:00