Commit graph

13418 commits

Author SHA1 Message Date
Ishaan Jaff
6fb2ae8731 docs mcp docs update 2025-03-21 17:21:40 -07:00
Krrish Dholakia
95ef5f1009 refactor(user_api_key_auth.py): move is_route_allowed to inside common_checks
ensures consistent behaviour inside api key + jwt routes
2025-03-21 17:21:07 -07:00
Ishaan Jaff
11d3ab5003 add experimental mcp client 2025-03-21 17:07:15 -07:00
Hammad Saeed
732ab55cc8 fix(model_param_helper.py): update _get_litellm_supported_transcription_kwargs() to use proper annotations from TranscriptionCreateParamsNonStreaming & TranscriptionCreateParamsStreaming 2025-03-21 16:55:35 -07:00
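A minimal sketch of the technique this commit (and the related __annotations__/__dict__ fix a few entries below) relies on: deriving the set of supported kwargs from a TypedDict's declared annotations. The stand-in TypedDicts only mimic the shape of openai's TranscriptionCreateParams classes so the example runs without a specific openai version; this is not LiteLLM's actual helper.

```python
from typing import TypedDict, Set


# Stand-ins for openai's TranscriptionCreateParamsNonStreaming / Streaming.
class TranscriptionCreateParamsNonStreaming(TypedDict, total=False):
    model: str
    language: str
    prompt: str
    response_format: str
    temperature: float


class TranscriptionCreateParamsStreaming(TranscriptionCreateParamsNonStreaming, total=False):
    stream: bool


def get_supported_transcription_kwargs() -> Set[str]:
    # __annotations__ on a TypedDict lists its declared fields (including inherited
    # ones), which is what a param helper wants; __dict__ also surfaces class-level
    # machinery and is brittle for this purpose.
    keys: Set[str] = set()
    for params_cls in (TranscriptionCreateParamsNonStreaming, TranscriptionCreateParamsStreaming):
        keys.update(params_cls.__annotations__.keys())
    return keys


if __name__ == "__main__":
    print(sorted(get_supported_transcription_kwargs()))
```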
Krrish Dholakia
91cf3fc40d test: initial e2e testing to ensure non admin jwt token cannot create new teams 2025-03-21 16:40:18 -07:00
Hammad Saeed
3b9dd9be36 fix(model_param_helper.py): change TranscriptionCreateParams.__annotations__ to TranscriptionCreateParams.__dict__ to clean logging error // handle typeddict 2025-03-21 16:30:46 -07:00
Krrish Dholakia
48e6a7036b test: mock sagemaker tests 2025-03-21 16:21:18 -07:00
Krrish Dholakia
1f4cee6a57 test: mock sagemaker tests 2025-03-21 16:18:02 -07:00
Krish Dholakia
c73f65da98
Merge pull request #9437 from BerriAI/litellm_dev_03_21_2025_p1
Consistent anthropic response_format streaming/non-streaming behaviour
2025-03-21 16:17:27 -07:00
Krrish Dholakia
c7b17495a1 test: add unit testing 2025-03-21 15:01:19 -07:00
Krrish Dholakia
dfea55a1e7 fix(in_memory_cache.py): add max value limits to in-memory cache. Prevents OOM errors in prod 2025-03-21 14:51:12 -07:00
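A hedged sketch of the idea behind this cache fix: bound both the number of entries and the size of any single value so a hot path cannot grow the process until it OOMs. The class, names, and limits below are illustrative, not LiteLLM's actual in_memory_cache.py implementation.

```python
import sys
from collections import OrderedDict
from typing import Any, Optional


class BoundedInMemoryCache:
    def __init__(self, max_items: int = 1024, max_value_bytes: int = 1_000_000):
        self.max_items = max_items
        self.max_value_bytes = max_value_bytes
        self._store: "OrderedDict[str, Any]" = OrderedDict()

    def set(self, key: str, value: Any) -> None:
        # Refuse oversized values instead of letting one huge payload exhaust memory.
        if sys.getsizeof(value) > self.max_value_bytes:
            return
        self._store[key] = value
        self._store.move_to_end(key)
        # Evict the oldest entries once the item cap is exceeded (simple LRU-style policy).
        while len(self._store) > self.max_items:
            self._store.popitem(last=False)

    def get(self, key: str) -> Optional[Any]:
        return self._store.get(key)


cache = BoundedInMemoryCache(max_items=2)
cache.set("a", 1)
cache.set("b", 2)
cache.set("c", 3)                      # evicts "a"
print(cache.get("a"), cache.get("c"))  # None 3
```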
Ishaan Jaff
147787b9e0 call_openai_tool on MCP client 2025-03-21 14:36:32 -07:00
Krish Dholakia
9aef9fdca8
Merge pull request #9438 from BerriAI/litellm_make_cooldown_more_descriptive
fix(handle_error.py): make cooldown error more descriptive
2025-03-21 12:23:00 -07:00
Krrish Dholakia
86be28b640 fix: fix linting error 2025-03-21 12:20:21 -07:00
Krish Dholakia
ea1b282512
Merge pull request #9439 from BerriAI/litellm_dev_03_20_2025_p2
support returning api-base on pass-through endpoints + consistently return 404 if team not found in DB
2025-03-21 10:52:36 -07:00
Ishaan Jaff
1a56bb5bdd transform_mcp_tool_to_openai_tool 2025-03-21 10:49:06 -07:00
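A sketch of what a transform_mcp_tool_to_openai_tool helper plausibly does: map an MCP tool definition (name / description / inputSchema) onto the OpenAI chat-completions "function" tool shape. The MCPTool dataclass is a stand-in for mcp.types.Tool so the example runs without the MCP SDK; it is not LiteLLM's exact code.

```python
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class MCPTool:
    name: str
    description: str = ""
    inputSchema: Dict[str, Any] = field(default_factory=dict)


def transform_mcp_tool_to_openai_tool(tool: MCPTool) -> Dict[str, Any]:
    return {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description,
            # MCP's inputSchema is already JSON Schema, which is what OpenAI
            # expects under "parameters".
            "parameters": tool.inputSchema or {"type": "object", "properties": {}},
        },
    }


weather_tool = MCPTool(
    name="get_weather",
    description="Return the weather for a city",
    inputSchema={"type": "object", "properties": {"city": {"type": "string"}}, "required": ["city"]},
)
print(transform_mcp_tool_to_openai_tool(weather_tool))
```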
Krrish Dholakia
8cc3c9204d fix(handle_error.py): make cooldown error more descriptive 2025-03-21 10:46:35 -07:00
Ishaan Jaff
d61febc053 change location of MCP client 2025-03-21 10:30:57 -07:00
Ishaan Jaff
3a48f9eee5 basic MCP client structure 2025-03-21 10:25:22 -07:00
Krrish Dholakia
e7ef14398f fix(anthropic/chat/transformation.py): correctly update response_format to tool call transformation
Fixes https://github.com/BerriAI/litellm/issues/9411
2025-03-21 10:20:21 -07:00
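A hedged sketch of the general technique behind this fix (and the "Consistent anthropic response_format" merge above): Anthropic has no native response_format parameter, so the requested JSON schema is exposed as a tool and the model is forced to call it via tool_choice. The helper and tool name below are assumptions for illustration, not the actual code in anthropic/chat/transformation.py.

```python
import json
from typing import Any, Dict

RESPONSE_FORMAT_TOOL_NAME = "json_tool_call"  # illustrative name


def response_format_to_anthropic_tool(response_format: Dict[str, Any]) -> Dict[str, Any]:
    schema = response_format.get("json_schema", {}).get(
        "schema", {"type": "object", "properties": {}}
    )
    return {
        "tools": [
            {
                "name": RESPONSE_FORMAT_TOOL_NAME,
                "description": "Return the response as structured JSON.",
                "input_schema": schema,
            }
        ],
        # Forcing this tool means streaming and non-streaming responses both come
        # back as the same tool_use block, keeping behaviour consistent.
        "tool_choice": {"type": "tool", "name": RESPONSE_FORMAT_TOOL_NAME},
    }


request_overrides = response_format_to_anthropic_tool(
    {
        "type": "json_schema",
        "json_schema": {
            "name": "answer",
            "schema": {"type": "object", "properties": {"answer": {"type": "string"}}},
        },
    }
)
print(json.dumps(request_overrides, indent=2))
```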
Ishaan Jaff
fd95499e15 fix naming 2025-03-20 21:57:28 -07:00
Ishaan Jaff
c44fe8bd90
Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro
[Feat] OpenAI o1-pro Responses API streaming support
2025-03-20 21:54:43 -07:00
Ishaan Jaff
7826c9bd21 add litellm mcp endpoints 2025-03-20 21:12:56 -07:00
Krrish Dholakia
0e0925fccf fix(team_endpoints.py): consistently return 404 if team not found in DB
Fixes response on /team/delete
2025-03-20 20:28:03 -07:00
Krrish Dholakia
532af66bbd feat(pass_through_endpoints.py): return api base on pass-through exception
enables easy debugging on backend api errors
2025-03-20 20:19:52 -07:00
Krrish Dholakia
943e036851 feat(pass_through_endpoints.py): support returning api-base on pass-through endpoints
Make it easier to debug what the api base sent to provider was
2025-03-20 20:11:49 -07:00
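A small sketch of the debugging idea in these pass-through commits: when a proxied provider call fails, surface which api_base the request was actually sent to in the error itself. The exception class and response builder below are illustrative, not LiteLLM's proxy code.

```python
from typing import Any, Dict


class PassThroughEndpointError(Exception):
    def __init__(self, message: str, api_base: str, status_code: int):
        super().__init__(f"{message} (api_base={api_base}, status={status_code})")
        self.api_base = api_base
        self.status_code = status_code


def build_error_response(exc: PassThroughEndpointError) -> Dict[str, Any]:
    # Including api_base in the error body makes it obvious which upstream the
    # proxy forwarded the request to.
    return {
        "error": {
            "message": str(exc),
            "api_base": exc.api_base,
            "status_code": exc.status_code,
        }
    }


try:
    raise PassThroughEndpointError(
        "upstream returned 500",
        api_base="https://api.example-provider.com",
        status_code=500,
    )
except PassThroughEndpointError as e:
    print(build_error_response(e))
```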
Ishaan Jaff
2afa73d584 fix mcp router 2025-03-20 18:05:14 -07:00
Ishaan Jaff
4dec84f883 cleanup 2025-03-20 18:02:02 -07:00
Ishaan Jaff
08d6ea37ef remove stale file 2025-03-20 17:59:19 -07:00
Ishaan Jaff
8909e24efc add mcp routes 2025-03-20 17:58:33 -07:00
Ishaan Jaff
f44a5a86db litellm mcp routes 2025-03-20 17:55:43 -07:00
Ishaan Jaff
dc33cfa074 init global_mcp_tool_registry 2025-03-20 17:53:37 -07:00
Ishaan Jaff
45ee1182ae example mcp tools 2025-03-20 17:53:20 -07:00
Ishaan Jaff
89cc404a57 add mcp tool registry 2025-03-20 17:51:03 -07:00
Ishaan Jaff
3dcba108ca load load_tools_from_config 2025-03-20 17:36:17 -07:00
Ishaan Jaff
5377daeede add MCPToolRegistry 2025-03-20 17:22:12 -07:00
Ishaan Jaff
9f22cf729a mock config for MCP tools 2025-03-20 17:16:55 -07:00
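A hedged sketch of what the MCP tool registry work in this stretch (global_mcp_tool_registry, MCPToolRegistry, load_tools_from_config, mock config) could look like: tools are registered under a name with a JSON-schema input spec and a handler, and a config block can register several at once. Class and field names here are assumptions, not LiteLLM's exact API.

```python
from typing import Any, Callable, Dict, List


class MCPToolRegistry:
    def __init__(self) -> None:
        self._tools: Dict[str, Dict[str, Any]] = {}

    def register_tool(self, name: str, description: str,
                      input_schema: Dict[str, Any],
                      handler: Callable[..., Any]) -> None:
        self._tools[name] = {
            "name": name,
            "description": description,
            "input_schema": input_schema,
            "handler": handler,
        }

    def load_tools_from_config(self, config: List[Dict[str, Any]]) -> None:
        # Each config entry names a tool and points at a callable; a real loader
        # would likely import the handler from a dotted path instead.
        for entry in config:
            self.register_tool(
                name=entry["name"],
                description=entry.get("description", ""),
                input_schema=entry.get("input_schema", {"type": "object", "properties": {}}),
                handler=entry["handler"],
            )

    def call_tool(self, name: str, arguments: Dict[str, Any]) -> Any:
        return self._tools[name]["handler"](**arguments)


global_mcp_tool_registry = MCPToolRegistry()
global_mcp_tool_registry.load_tools_from_config(
    [
        {
            "name": "add",
            "description": "Add two integers",
            "input_schema": {
                "type": "object",
                "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
            },
            "handler": lambda a, b: a + b,
        }
    ]
)
print(global_mcp_tool_registry.call_tool("add", {"a": 2, "b": 3}))  # 5
```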
Ishaan Jaff
93836fa84a sample mcp server 2025-03-20 15:37:24 -07:00
Krish Dholakia
ab385848c1
Merge pull request #9260 from Grizzly-jobs/fix/voyage-ai-token-usage-tracking
fix: VoyageAI `prompt_token` always empty
2025-03-20 14:00:51 -07:00
Ishaan Jaff
1829cc2042 fix code quality checks 2025-03-20 13:57:35 -07:00
Krish Dholakia
706bcf4432
Merge pull request #9366 from JamesGuthrie/jg/vertex-output-dimensionality
fix: VertexAI outputDimensionality configuration
2025-03-20 13:55:33 -07:00
Ishaan Jaff
4b4a0b2612 supports_native_streaming 2025-03-20 13:52:30 -07:00
Krrish Dholakia
76c3957140 fix(main.py): fix OR import 2025-03-20 13:52:28 -07:00
Krish Dholakia
cb4155fb16
Merge pull request #9369 from graysonchen/feature/add_openrouter_api_base
feat: Add support for custom OPENROUTER_API_BASE via get_secret in co…
2025-03-20 13:52:03 -07:00
Krish Dholakia
0b399c391c
Merge pull request #9402 from BerriAI/litellm_openai_file_types
fix(types/utils.py): support openai 'file' message type
2025-03-20 13:46:44 -07:00
Ishaan Jaff
0352559c66 supports_native_streaming 2025-03-20 13:34:57 -07:00
Ishaan Jaff
a29587e178 MockResponsesAPIStreamingIterator 2025-03-20 12:30:09 -07:00
Ishaan Jaff
55115bf520 transform_responses_api_request 2025-03-20 12:28:55 -07:00
Ishaan Jaff
af8e917eb0 MockResponsesAPIStreamingIterator 2025-03-20 12:25:58 -07:00
Ishaan Jaff
c64f834272 add AsyncMockResponsesAPIStreamingIterator 2025-03-20 12:06:36 -07:00