| Author | Commit | Message | Date |
|---|---|---|---|
| Ishaan Jaff | b763174326 | litellm mcp bridge docs | 2025-03-21 17:00:56 -07:00 |
| Hammad Saeed | 732ab55cc8 | fix(model_param_helper.py): update _get_litellm_supported_transcription_kwargs() to use proper annotations from `TranscriptionCreateParamsNonStreaming` & `TranscriptionCreateParamsStreaming` | 2025-03-21 16:55:35 -07:00 |
| Ishaan Jaff | f07d127963 | add diagram for litellm mcp integration | 2025-03-21 16:54:55 -07:00 |
| Ishaan Jaff | 751b4bf968 | docs litellm mcp bridge | 2025-03-21 16:50:30 -07:00 |
| Krrish Dholakia | 91cf3fc40d | test: initial e2e testing to ensure non admin jwt token cannot create new teams | 2025-03-21 16:40:18 -07:00 |
| Hammad Saeed | 3b9dd9be36 | fix(model_param_helper.py): change TranscriptionCreateParams.__annotations__ to TranscriptionCreateParams.__dict__ to clean logging error // handle typeddict | 2025-03-21 16:30:46 -07:00 |
| Krrish Dholakia | 48e6a7036b | test: mock sagemaker tests | 2025-03-21 16:21:18 -07:00 |
| Krrish Dholakia | 58f46d847c | build: update pull_request_template.md | 2025-03-21 16:20:28 -07:00 |
| Krrish Dholakia | 1f4cee6a57 | test: mock sagemaker tests | 2025-03-21 16:18:02 -07:00 |
| Krish Dholakia | c73f65da98 | Merge pull request #9437 from BerriAI/litellm_dev_03_21_2025_p1: Consistent anthropic response_format streaming/non-streaming behaviour | 2025-03-21 16:17:27 -07:00 |
| Krrish Dholakia | 8265a88e0a | test: update tests | 2025-03-21 15:10:30 -07:00 |
| Krrish Dholakia | c7b17495a1 | test: add unit testing | 2025-03-21 15:01:19 -07:00 |
| Krrish Dholakia | dfea55a1e7 | fix(in_memory_cache.py): add max value limits to in-memory cache. Prevents OOM errors in prod | 2025-03-21 14:51:12 -07:00 |
| Ishaan Jaff | b8b7e5e6cf | clean up | 2025-03-21 14:39:05 -07:00 |
| Ishaan Jaff | 147787b9e0 | call_openai_tool on MCP client | 2025-03-21 14:36:32 -07:00 |
| Krrish Dholakia | f68e27a0c8 | docs(image_handling.md): architecture doc on image handling on the proxy | 2025-03-21 14:19:05 -07:00 |
| Krish Dholakia | 9aef9fdca8 | Merge pull request #9438 from BerriAI/litellm_make_cooldown_more_descriptive: fix(handle_error.py): make cooldown error more descriptive | 2025-03-21 12:23:00 -07:00 |
| Krrish Dholakia | 86be28b640 | fix: fix linting error | 2025-03-21 12:20:21 -07:00 |
| Ishaan Jaff | 1f3aa82095 | fix beta caps | 2025-03-21 10:53:10 -07:00 |
| Krish Dholakia | ea1b282512 | Merge pull request #9439 from BerriAI/litellm_dev_03_20_2025_p2: support returning api-base on pass-through endpoints + consistently return 404 if team not found in DB | 2025-03-21 10:52:36 -07:00 |
| Krrish Dholakia | a1b716c1ef | test: fix test - handle llm api inconsistency | 2025-03-21 10:51:34 -07:00 |
| Ishaan Jaff | bbf1962540 | fix llm responses | 2025-03-21 10:50:55 -07:00 |
| Ishaan Jaff | 1a56bb5bdd | transform_mcp_tool_to_openai_tool | 2025-03-21 10:49:06 -07:00 |
| Krrish Dholakia | 8cc3c9204d | fix(handle_error.py): make cooldown error more descriptive | 2025-03-21 10:46:35 -07:00 |
| Krrish Dholakia | 81a1494a51 | test: add unit testing | 2025-03-21 10:35:36 -07:00 |
| Ishaan Jaff | d3279d114e | litellm MCP client 1 | 2025-03-21 10:32:51 -07:00 |
| Ishaan Jaff | d61febc053 | change location of MCP client | 2025-03-21 10:30:57 -07:00 |
| Ishaan Jaff | 3a48f9eee5 | basic MCP client structure | 2025-03-21 10:25:22 -07:00 |
| Krrish Dholakia | e7ef14398f | fix(anthropic/chat/transformation.py): correctly update response_format to tool call transformation (Fixes https://github.com/BerriAI/litellm/issues/9411) | 2025-03-21 10:20:21 -07:00 |
| Ishaan Jaff | 177e72334c | simple MCP interface | 2025-03-21 10:11:06 -07:00 |
| Ishaan Jaff | b794e78288 | Merge pull request #9435 from BerriAI/litellm_mcp_client_interface: [Docs - Draft] LiteLLM x MCP Interface | 2025-03-21 09:38:38 -07:00 |
| Ishaan Jaff | a2c9fd16ae | docs litellm mcp | 2025-03-21 09:37:04 -07:00 |
| Ishaan Jaff | 604c2b80eb | docs litellm mcp | 2025-03-21 09:34:38 -07:00 |
| Ishaan Jaff | c5614d4389 | docs add central platform team control on MCP | 2025-03-21 09:33:09 -07:00 |
| Ishaan Jaff | 145d2d8dd6 | doc litellm MCP client | 2025-03-21 09:26:03 -07:00 |
| Ishaan Jaff | 5bc07b0c5d | test tool registry | 2025-03-20 22:03:56 -07:00 |
| Ishaan Jaff | fd95499e15 | fix naming | 2025-03-20 21:57:28 -07:00 |
| Ishaan Jaff | 90877d2c1c | bump requirements.txt | 2025-03-20 21:56:10 -07:00 |
| Ishaan Jaff | c44fe8bd90 | Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro: [Feat] OpenAI o1-pro Responses API streaming support | 2025-03-20 21:54:43 -07:00 |
| Ishaan Jaff | 2cd170daeb | add mcp==1.4.1 | 2025-03-20 21:50:11 -07:00 |
| Ishaan Jaff | 435fba929e | litellm mcp support | 2025-03-20 21:48:55 -07:00 |
| Ishaan Jaff | 9173685ec2 | docs mcp tool spec | 2025-03-20 21:40:47 -07:00 |
| Ishaan Jaff | 7a07f2e305 | litellm MCP example | 2025-03-20 21:36:28 -07:00 |
| Ishaan Jaff | 8089c5e3c0 | litellm MCP | 2025-03-20 21:35:06 -07:00 |
| Ishaan Jaff | 7826c9bd21 | add litellm mcp endpoints | 2025-03-20 21:12:56 -07:00 |
| Krrish Dholakia | 0e0925fccf | fix(team_endpoints.py): consistently return 404 if team not found in DB (fixes response on /team/delete) | 2025-03-20 20:28:03 -07:00 |
| Krrish Dholakia | 532af66bbd | feat(pass_through_endpoints.py): return api base on pass-through exception (enables easy debugging on backend api errors) | 2025-03-20 20:19:52 -07:00 |
| Krrish Dholakia | 943e036851 | feat(pass_through_endpoints.py): support returning api-base on pass-through endpoints (make it easier to debug what the api base sent to provider was) | 2025-03-20 20:11:49 -07:00 |
| Ishaan Jaff | 2afa73d584 | fix mcp router | 2025-03-20 18:05:14 -07:00 |
| Ishaan Jaff | 4dec84f883 | cleanup | 2025-03-20 18:02:02 -07:00 |