Krrish Dholakia | 86be28b640 | fix: fix linting error | 2025-03-21 12:20:21 -07:00
Krish Dholakia | ea1b282512 | Merge pull request #9439 from BerriAI/litellm_dev_03_20_2025_p2 (support returning api-base on pass-through endpoints + consistently return 404 if team not found in DB) | 2025-03-21 10:52:36 -07:00
Ishaan Jaff | 1a56bb5bdd | transform_mcp_tool_to_openai_tool | 2025-03-21 10:49:06 -07:00
Krrish Dholakia | 8cc3c9204d | fix(handle_error.py): make cooldown error more descriptive | 2025-03-21 10:46:35 -07:00
Ishaan Jaff | d61febc053 | change location of MCP client | 2025-03-21 10:30:57 -07:00
Ishaan Jaff | 3a48f9eee5 | basic MCP client structure | 2025-03-21 10:25:22 -07:00
Krrish Dholakia | e7ef14398f | fix(anthropic/chat/transformation.py): correctly update response_format to tool call transformation (Fixes https://github.com/BerriAI/litellm/issues/9411) | 2025-03-21 10:20:21 -07:00
Ishaan Jaff | fd95499e15 | fix naming | 2025-03-20 21:57:28 -07:00
Ishaan Jaff | c44fe8bd90 | Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro ([Feat] OpenAI o1-pro Responses API streaming support) | 2025-03-20 21:54:43 -07:00
Ishaan Jaff | 7826c9bd21 | add litellm mcp endpoints | 2025-03-20 21:12:56 -07:00
Krrish Dholakia | 0e0925fccf | fix(team_endpoints.py): consistently return 404 if team not found in DB (Fixes response on /team/delete) | 2025-03-20 20:28:03 -07:00
Krrish Dholakia | 532af66bbd | feat(pass_through_endpoints.py): return api base on pass-through exception (enables easy debugging on backend api errors) | 2025-03-20 20:19:52 -07:00
Krrish Dholakia | 943e036851 | feat(pass_through_endpoints.py): support returning api-base on pass-through endpoints (make it easier to debug what api base was sent to the provider) | 2025-03-20 20:11:49 -07:00
Ishaan Jaff | 2afa73d584 | fix mcp router | 2025-03-20 18:05:14 -07:00
Ishaan Jaff | 4dec84f883 | cleanup | 2025-03-20 18:02:02 -07:00
Ishaan Jaff | 08d6ea37ef | remove stale file | 2025-03-20 17:59:19 -07:00
Ishaan Jaff | 8909e24efc | add mcp routes | 2025-03-20 17:58:33 -07:00
Ishaan Jaff | f44a5a86db | litellm mcp routes | 2025-03-20 17:55:43 -07:00
Ishaan Jaff | dc33cfa074 | init global_mcp_tool_registry | 2025-03-20 17:53:37 -07:00
Ishaan Jaff | 45ee1182ae | example mcp tools | 2025-03-20 17:53:20 -07:00
Ishaan Jaff | 89cc404a57 | add mcp tool registry | 2025-03-20 17:51:03 -07:00
Ishaan Jaff | 3dcba108ca | load load_tools_from_config | 2025-03-20 17:36:17 -07:00
Ishaan Jaff | 5377daeede | add MCPToolRegistry | 2025-03-20 17:22:12 -07:00
Ishaan Jaff | 9f22cf729a | mock config for MCP tools | 2025-03-20 17:16:55 -07:00
Ishaan Jaff | 93836fa84a | sample mcp server | 2025-03-20 15:37:24 -07:00
Krish Dholakia | ab385848c1 | Merge pull request #9260 from Grizzly-jobs/fix/voyage-ai-token-usage-tracking (fix: VoyageAI `prompt_token` always empty) | 2025-03-20 14:00:51 -07:00
Ishaan Jaff | 1829cc2042 | fix code quality checks | 2025-03-20 13:57:35 -07:00
Krish Dholakia | 706bcf4432 | Merge pull request #9366 from JamesGuthrie/jg/vertex-output-dimensionality (fix: VertexAI outputDimensionality configuration) | 2025-03-20 13:55:33 -07:00
Ishaan Jaff | 4b4a0b2612 | supports_native_streaming | 2025-03-20 13:52:30 -07:00
Krrish Dholakia | 76c3957140 | fix(main.py): fix OR import | 2025-03-20 13:52:28 -07:00
Krish Dholakia | cb4155fb16 | Merge pull request #9369 from graysonchen/feature/add_openrouter_api_base (feat: Add support for custom OPENROUTER_API_BASE via get_secret in co…) | 2025-03-20 13:52:03 -07:00
Krish Dholakia | 0b399c391c | Merge pull request #9402 from BerriAI/litellm_openai_file_types (fix(types/utils.py): support openai 'file' message type) | 2025-03-20 13:46:44 -07:00
Ishaan Jaff | 0352559c66 | supports_native_streaming | 2025-03-20 13:34:57 -07:00
Ishaan Jaff | a29587e178 | MockResponsesAPIStreamingIterator | 2025-03-20 12:30:09 -07:00
Ishaan Jaff | 55115bf520 | transform_responses_api_request | 2025-03-20 12:28:55 -07:00
Ishaan Jaff | af8e917eb0 | MockResponsesAPIStreamingIterator | 2025-03-20 12:25:58 -07:00
Ishaan Jaff | c64f834272 | add AsyncMockResponsesAPIStreamingIterator | 2025-03-20 12:06:36 -07:00
Ishaan Jaff | 0cd671785d | add fake_stream to llm http handler | 2025-03-20 09:55:59 -07:00
Ishaan Jaff | bc174adcd0 | add should_fake_stream | 2025-03-20 09:54:26 -07:00
Krrish Dholakia | 8ef9129556 | fix(types/utils.py): support openai 'file' message type (Closes https://github.com/BerriAI/litellm/issues/9365) | 2025-03-19 23:13:51 -07:00
Krish Dholakia | 7cf7e80c5a | Merge pull request #9395 from BerriAI/litellm_ui_fixes_03_19_2025 (fix(internal_user_endpoints.py): re-introduce upsert on user not found) | 2025-03-19 22:58:32 -07:00
Krrish Dholakia | fe24b9d90b | feat(azure/gpt_transformation.py): add azure audio model support (Closes https://github.com/BerriAI/litellm/issues/6305) | 2025-03-19 22:57:49 -07:00
Ishaan Jaff | 9203910ab6 | fix import hashlib | 2025-03-19 21:08:19 -07:00
Ishaan Jaff | 1bd7443c25 | Merge pull request #9384 from BerriAI/litellm_prompt_management_custom ([Feat] - Allow building custom prompt management integration) | 2025-03-19 21:06:41 -07:00
Ishaan Jaff | 247e4d09ee | Merge branch 'main' into litellm_fix_ssl_verify | 2025-03-19 21:03:06 -07:00
Ishaan Jaff | dae5c24b55 | get_chat_completion_prompt | 2025-03-19 20:50:15 -07:00
Krrish Dholakia | 056d34742f | build(model_prices_and_context_window.json): fix native streaming flag | 2025-03-19 19:53:19 -07:00
Krrish Dholakia | 78781a0a55 | build(model_prices_and_context_window.json): add o1-pro pricing | 2025-03-19 19:35:35 -07:00
Krrish Dholakia | 07b1e6aa1c | fix(internal_user_endpoints.py): re-introduce upsert on user not found (Fixes https://github.com/BerriAI/litellm/issues/9243) | 2025-03-19 19:28:11 -07:00
Krish Dholakia | 0de015b4c5 | Revert "Fix latency redis" | 2025-03-19 18:11:22 -07:00