Commit graph

13391 commits

Author SHA1 Message Date
Krrish Dholakia
92d4486a2c fix(llm_passthrough_endpoints.py): raise verbose error if credentials not found on proxy 2025-03-22 11:49:51 -07:00
Ishaan Jaff
792a2d6115 test_is_chunk_non_empty_with_annotations 2025-03-22 11:41:53 -07:00
Ishaan Jaff
44f4c623e2 Add annotations to the delta 2025-03-22 11:38:30 -07:00
Krrish Dholakia
06e69a414e fix(vertex_ai/common_utils.py): fix handling constructed url with default vertex config 2025-03-22 11:32:01 -07:00
Krrish Dholakia
b44b3bd36b feat(llm_passthrough_endpoints.py): base case passing for refactored vertex passthrough route 2025-03-22 11:06:52 -07:00
Ishaan Jaff
dc5bf10009 feat - add openai web search 2025-03-22 10:43:35 -07:00
Krrish Dholakia
94d3413335 refactor(llm_passthrough_endpoints.py): refactor vertex passthrough to use common llm passthrough handler.py 2025-03-22 10:42:46 -07:00
Ishaan Jaff
4683b8ea70 add search_context_cost_per_1k_calls 2025-03-22 10:24:08 -07:00
Krish Dholakia
950edd76b3
Merge pull request #9454 from BerriAI/litellm_dev_03_21_2025_p3
Fix route check for non-proxy admins on jwt auth
2025-03-21 22:32:46 -07:00
Ishaan Jaff
9bc5aaee64 fix ModelParamHelper 2025-03-21 21:35:49 -07:00
Ishaan Jaff
23b86ccd9f
Merge branch 'main' into litellm_contributor_fix_mar_21 2025-03-21 21:07:20 -07:00
Ishaan Jaff
075f3537f6 bump version 2025-03-21 21:03:42 -07:00
Ishaan Jaff
7e1cd7cbeb bug fix azure/gpt-4.5-preview was added as litellm_provider=openai, should be azure 2025-03-21 20:50:13 -07:00
Ishaan Jaff
ed74b419a3
Merge pull request #9436 from BerriAI/litellm_mcp_interface
[Feat] LiteLLM x MCP Bridge - Use MCP Tools with LiteLLM
2025-03-21 20:42:16 -07:00
Ishaan Jaff
c63217e4d4 _get_litellm_supported_transcription_kwargs 2025-03-21 19:20:13 -07:00
Ishaan Jaff
478d209a0c
Merge pull request #9451 from hsaeed3/correct-transcription-params
fix(model_param_helper.py): update `_get_litellm_supported_transcription_kwargs()` to use proper annotations from `TranscriptionCreateParamsNonStreaming` & `TranscriptionCreateParamsStreaming`
2025-03-21 19:10:21 -07:00
Krrish Dholakia
f0d047a0e4 fix: fix linting error 2025-03-21 18:23:58 -07:00
Ishaan Jaff
8d770f0ccf fix mcp client 2025-03-21 18:18:23 -07:00
Krrish Dholakia
8b0c75d88f fix: fix linting error 2025-03-21 18:08:00 -07:00
Krish Dholakia
dfb41c927e
Merge pull request #9448 from BerriAI/litellm_dev_03_21_2025_p2
Set max size limit to in-memory cache item - prevents OOM errors
2025-03-21 17:51:46 -07:00
Krrish Dholakia
2d1082f520 fix(proxy/_types.py): handle user_email=None 2025-03-21 17:46:18 -07:00
Krrish Dholakia
6034187aa5 fix: remove unused import 2025-03-21 17:25:57 -07:00
Krrish Dholakia
4b49dab717 fix: remove unused import 2025-03-21 17:22:58 -07:00
Ishaan Jaff
6fb2ae8731 docs mcp docs update 2025-03-21 17:21:40 -07:00
Krrish Dholakia
95ef5f1009 refactor(user_api_key_auth.py): move is_route_allowed to inside common_checks
ensures consistent behaviour inside api key + jwt routes
2025-03-21 17:21:07 -07:00
Ishaan Jaff
11d3ab5003 add experimental mcp client 2025-03-21 17:07:15 -07:00
Hammad Saeed
732ab55cc8 fix(model_param_helper.py): update _get_litellm_supported_transcription_kwargs() to use proper annotations from TranscriptionCreateParamsNonStreaming & TranscriptionCreateParamsStreaming 2025-03-21 16:55:35 -07:00
Krrish Dholakia
91cf3fc40d test: initial e2e testing to ensure non admin jwt token cannot create new teams 2025-03-21 16:40:18 -07:00
Hammad Saeed
3b9dd9be36 fix(model_param_helper.py): change TranscriptionCreateParams.__annotations__ to TranscriptionCreateParams.__dict__ to clean logging error // handle typeddict 2025-03-21 16:30:46 -07:00
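The two transcription-param fixes above hinge on how a `TypedDict` exposes its declared fields: `__annotations__` returns the declared keys as a name-to-type mapping (which `__dict__` does not reliably give for a TypedDict). A minimal illustration, using a stand-in TypedDict rather than OpenAI's real `TranscriptionCreateParams` types:

```python
from typing import TypedDict

class TranscriptionParams(TypedDict, total=False):
    # Stand-in for TranscriptionCreateParamsNonStreaming; the fields here
    # are illustrative, not the real OpenAI parameter set.
    model: str
    language: str
    temperature: float

# Enumerate the supported kwargs by reading the TypedDict's annotations.
supported_kwargs = set(TranscriptionParams.__annotations__.keys())
```

Passing `supported_kwargs` through a filter like this is one way a param helper can drop unsupported keyword arguments before calling the provider SDK.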
Krrish Dholakia
48e6a7036b test: mock sagemaker tests 2025-03-21 16:21:18 -07:00
Krrish Dholakia
1f4cee6a57 test: mock sagemaker tests 2025-03-21 16:18:02 -07:00
Krish Dholakia
c73f65da98
Merge pull request #9437 from BerriAI/litellm_dev_03_21_2025_p1
Consistent anthropic response_format streaming/non-streaming behaviour
2025-03-21 16:17:27 -07:00
Krrish Dholakia
c7b17495a1 test: add unit testing 2025-03-21 15:01:19 -07:00
Krrish Dholakia
dfea55a1e7 fix(in_memory_cache.py): add max value limits to in-memory cache. Prevents OOM errors in prod 2025-03-21 14:51:12 -07:00
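The in-memory-cache fix above caps the size of cached values so a single large item (or unbounded growth) cannot trigger an OOM. A minimal sketch of the idea — the class name, limits, and FIFO eviction policy are illustrative assumptions, not LiteLLM's actual implementation:

```python
import sys
from collections import OrderedDict

class BoundedInMemoryCache:
    """Illustrative in-memory cache: rejects oversized values and evicts
    the oldest entry when the item count limit is reached."""

    def __init__(self, max_items: int = 1024, max_value_bytes: int = 1_000_000):
        self.max_items = max_items
        self.max_value_bytes = max_value_bytes
        self._store: "OrderedDict[str, object]" = OrderedDict()

    def set(self, key: str, value: object) -> bool:
        # Refuse to cache values larger than the per-item byte limit.
        if sys.getsizeof(value) > self.max_value_bytes:
            return False
        if key in self._store:
            self._store.move_to_end(key)
        elif len(self._store) >= self.max_items:
            self._store.popitem(last=False)  # evict the oldest entry (FIFO)
        self._store[key] = value
        return True

    def get(self, key: str):
        return self._store.get(key)
```

Rejecting oversized values outright (rather than evicting to make room) keeps one pathological response from flushing the whole cache.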
Ishaan Jaff
147787b9e0 call_openai_tool on MCP client 2025-03-21 14:36:32 -07:00
Krish Dholakia
9aef9fdca8
Merge pull request #9438 from BerriAI/litellm_make_cooldown_more_descriptive
fix(handle_error.py): make cooldown error more descriptive
2025-03-21 12:23:00 -07:00
Krrish Dholakia
86be28b640 fix: fix linting error 2025-03-21 12:20:21 -07:00
Krish Dholakia
ea1b282512
Merge pull request #9439 from BerriAI/litellm_dev_03_20_2025_p2
support returning api-base on pass-through endpoints +  consistently return 404 if team not found in DB
2025-03-21 10:52:36 -07:00
Ishaan Jaff
1a56bb5bdd transform_mcp_tool_to_openai_tool 2025-03-21 10:49:06 -07:00
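The `transform_mcp_tool_to_openai_tool` commit maps an MCP tool definition onto OpenAI's function-tool schema. The sketch below shows the general shape of such a mapping and is an assumption about the approach, not LiteLLM's code; MCP tools carry a JSON Schema under `inputSchema`, which OpenAI's format calls `parameters`:

```python
def transform_mcp_tool_to_openai_tool(mcp_tool: dict) -> dict:
    """Illustrative sketch: convert an MCP tool (name / description /
    inputSchema) into an OpenAI chat-completions function tool."""
    return {
        "type": "function",
        "function": {
            "name": mcp_tool["name"],
            "description": mcp_tool.get("description", ""),
            # MCP's "inputSchema" JSON Schema becomes OpenAI's "parameters".
            "parameters": mcp_tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }
```

With a transform like this, tools listed from an MCP server can be passed directly in the `tools` array of a chat-completions request.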
Krrish Dholakia
8cc3c9204d fix(handle_error.py): make cooldown error more descriptive 2025-03-21 10:46:35 -07:00
Ishaan Jaff
d61febc053 change location of MCP client 2025-03-21 10:30:57 -07:00
Ishaan Jaff
3a48f9eee5 basic MCP client structure 2025-03-21 10:25:22 -07:00
Krrish Dholakia
e7ef14398f fix(anthropic/chat/transformation.py): correctly update response_format to tool call transformation
Fixes https://github.com/BerriAI/litellm/issues/9411
2025-03-21 10:20:21 -07:00
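Anthropic's API has no native `response_format`, so a common approach — and roughly what the transformation fix above concerns — is to translate the requested JSON schema into a forced tool call whose input schema enforces the structure. A hypothetical sketch (the tool name and exact field layout are assumptions, not LiteLLM's implementation):

```python
def response_format_to_tool(response_format: dict) -> dict:
    """Illustrative sketch: convert an OpenAI-style json_schema
    response_format into an Anthropic tool definition."""
    schema = response_format.get("json_schema", {}).get(
        "schema", {"type": "object", "properties": {}}
    )
    return {
        # Hypothetical tool name; the request would then force this tool.
        "name": "json_tool_call",
        "description": "Return the response in the required JSON format.",
        "input_schema": schema,
    }
```

Making this translation identical for streaming and non-streaming requests is what keeps the two code paths returning the same shaped output.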
Ishaan Jaff
fd95499e15 fix naming 2025-03-20 21:57:28 -07:00
Ishaan Jaff
c44fe8bd90
Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro
[Feat] OpenAI o1-pro Responses API streaming support
2025-03-20 21:54:43 -07:00
Ishaan Jaff
7826c9bd21 add litellm mcp endpoints 2025-03-20 21:12:56 -07:00
Krrish Dholakia
0e0925fccf fix(team_endpoints.py): consistently return 404 if team not found in DB
Fixes response on /team/delete
2025-03-20 20:28:03 -07:00
Krrish Dholakia
532af66bbd feat(pass_through_endpoints.py): return api base on pass-through exception
enables easy debugging on backend api errors
2025-03-20 20:19:52 -07:00
Krrish Dholakia
943e036851 feat(pass_through_endpoints.py): support returning api-base on pass-through endpoints
Make it easier to debug what the api base sent to provider was
2025-03-20 20:11:49 -07:00
Ishaan Jaff
2afa73d584 fix mcp router 2025-03-20 18:05:14 -07:00