a8cf71973d | 2025-03-21 17:21:40 -07:00 | Ishaan Jaff | docs mcp docs update
b006e325cc | 2025-03-21 17:07:15 -07:00 | Ishaan Jaff | add experimental mcp client
0b021b8334 | 2025-03-21 14:36:32 -07:00 | Ishaan Jaff | call_openai_tool on MCP client
cd6055cfb0 | 2025-03-21 10:49:06 -07:00 | Ishaan Jaff | transform_mcp_tool_to_openai_tool
039129676c | 2025-03-21 10:30:57 -07:00 | Ishaan Jaff | change location of MCP client
50ec2bd5c9 | 2025-03-21 10:25:22 -07:00 | Ishaan Jaff | basic MCP client structure
6feb83eb51 | 2025-03-20 21:54:43 -07:00 | Ishaan Jaff | Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro
    [Feat] OpenAI o1-pro Responses API streaming support
bc03378fef | 2025-03-20 14:00:51 -07:00 | Krish Dholakia | Merge pull request #9260 from Grizzly-jobs/fix/voyage-ai-token-usage-tracking
    fix: VoyageAI `prompt_token` always empty
3088204ac2 | 2025-03-20 13:57:35 -07:00 | Ishaan Jaff | fix code quality checks
f5f92bf6ae | 2025-03-20 13:55:33 -07:00 | Krish Dholakia | Merge pull request #9366 from JamesGuthrie/jg/vertex-output-dimensionality
    fix: VertexAI outputDimensionality configuration
360931ef1b | 2025-03-20 13:52:30 -07:00 | Ishaan Jaff | supports_native_streaming
803f723fb9 | 2025-03-20 13:52:28 -07:00 | Krrish Dholakia | fix(main.py): fix OR import
dd135099cc | 2025-03-20 13:52:03 -07:00 | Krish Dholakia | Merge pull request #9369 from graysonchen/feature/add_openrouter_api_base
    feat: Add support for custom OPENROUTER_API_BASE via get_secret in co…
97ade8ee75 | 2025-03-20 13:46:44 -07:00 | Krish Dholakia | Merge pull request #9402 from BerriAI/litellm_openai_file_types
    fix(types/utils.py): support openai 'file' message type
4be40d1ba0 | 2025-03-20 13:34:57 -07:00 | Ishaan Jaff | supports_native_streaming
6d4cf6581d | 2025-03-20 12:30:09 -07:00 | Ishaan Jaff | MockResponsesAPIStreamingIterator
435a89dd79 | 2025-03-20 12:28:55 -07:00 | Ishaan Jaff | transform_responses_api_request
830c51805c | 2025-03-20 12:25:58 -07:00 | Ishaan Jaff | MockResponsesAPIStreamingIterator
69c70ad80a | 2025-03-20 12:06:36 -07:00 | Ishaan Jaff | add AsyncMockResponsesAPIStreamingIterator
6608770e64 | 2025-03-20 09:55:59 -07:00 | Ishaan Jaff | add fake_stream to llm http handler
1567e52185 | 2025-03-20 09:54:26 -07:00 | Ishaan Jaff | add should_fake_stream
dcbe5df98b | 2025-03-19 23:13:51 -07:00 | Krrish Dholakia | fix(types/utils.py): support openai 'file' message type
    Closes https://github.com/BerriAI/litellm/issues/9365
ffd7e86449 | 2025-03-19 22:58:32 -07:00 | Krish Dholakia | Merge pull request #9395 from BerriAI/litellm_ui_fixes_03_19_2025
    fix(internal_user_endpoints.py): re-introduce upsert on user not found
b228456b67 | 2025-03-19 22:57:49 -07:00 | Krrish Dholakia | feat(azure/gpt_transformation.py): add azure audio model support
    Closes https://github.com/BerriAI/litellm/issues/6305
08cb68c8fb | 2025-03-19 21:08:19 -07:00 | Ishaan Jaff | fix import hashlib
59b308a8f0 | 2025-03-19 21:06:41 -07:00 | Ishaan Jaff | Merge pull request #9384 from BerriAI/litellm_prompt_management_custom
    [Feat] - Allow building custom prompt management integration
c15e38a148 | 2025-03-19 21:03:06 -07:00 | Ishaan Jaff | Merge branch 'main' into litellm_fix_ssl_verify
66b9a4a686 | 2025-03-19 20:50:15 -07:00 | Ishaan Jaff | get_chat_completion_prompt
9b9c24d80b | 2025-03-19 19:53:19 -07:00 | Krrish Dholakia | build(model_prices_and_context_window.json): fix native streaming flag
17fe6784b5 | 2025-03-19 19:35:35 -07:00 | Krrish Dholakia | build(model_prices_and_context_window.json): add o1-pro pricing
70addc3e14 | 2025-03-19 19:28:11 -07:00 | Krrish Dholakia | fix(internal_user_endpoints.py): re-introduce upsert on user not found
    Fixes https://github.com/BerriAI/litellm/issues/9243
2ec1ea9c31 | 2025-03-19 18:11:22 -07:00 | Krish Dholakia | Revert "Fix latency redis"
396ae38b9a | 2025-03-19 18:11:18 -07:00 | Krish Dholakia | Merge pull request #9387 from emerzon/fix_latency_redis
    Fix latency redis
fc31e20e04 | 2025-03-19 19:56:57 -05:00 | Emerson Gomes | Handle empty valid_deployments in LowestLatencyLoggingHandler
4db3ca70e4 | 2025-03-19 17:38:24 -07:00 | Ishaan Jaff | get_custom_logger_for_prompt_management
40e1ccf339 | 2025-03-19 16:38:59 -07:00 | Ishaan Jaff | CustomPromptManagement
5cf8ca4374 | 2025-03-19 16:37:47 -07:00 | Krish Dholakia | Merge pull request #9382 from BerriAI/litellm_dev_03_19_2025_p2
    Support 'prisma migrate' for db schema changes
98c5ffdb20 | 2025-03-19 16:36:41 -07:00 | Ishaan Jaff | get_custom_loggers_for_type
c3e33495b5 | 2025-03-19 16:24:41 -07:00 | Ishaan Jaff | example X42PromptManagement
803e643f64 | 2025-03-19 16:22:23 -07:00 | Ishaan Jaff | define CustomPromptManagement
445ee6d2aa | 2025-03-19 15:45:10 -07:00 | Krish Dholakia | Merge pull request #9357 from BerriAI/litellm_dev_03_18_2025_p2
    fix(lowest_tpm_rpm_v2.py): support batch writing increments to redis
d914cc1575 | 2025-03-19 15:33:04 -07:00 | Krrish Dholakia | fix: remove unused import
a2d3d8017e | 2025-03-19 14:26:59 -07:00 | Krrish Dholakia | feat(prisma_client.py): initial commit add prisma migration support to proxy
db8a459c89 | 2025-03-19 11:35:14 -07:00 | Krrish Dholakia | fix(proxy_server.py): move db credential check to only run behind feature flag
b186210cea | 2025-03-19 21:09:03 +08:00 | Grayson Chen | feat: Add support for custom OPENROUTER_API_BASE via get_secret in completion function
4044bca614 | 2025-03-19 11:07:36 +01:00 | James Guthrie | fix: VertexAI outputDimensionality configuration
    VertexAI's API documentation [1] is an absolute mess. In it, they
    describe the parameter to configure output dimensionality as
    `output_dimensionality`. In the API example, they switch to camel
    case `outputDimensionality`, which is the correct variant.
    [1]: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/text-embeddings-api#generative-ai-get-text-embedding-drest
887648a364 | 2025-03-18 23:36:12 -07:00 | Krish Dholakia | Merge pull request #9363 from BerriAI/litellm_dev_03_18_2025_p3
    fix(common_utils.py): handle cris only model
db3a65d52a | 2025-03-18 23:35:43 -07:00 | Krrish Dholakia | fix(common_utils.py): handle cris only model
    Fixes https://github.com/BerriAI/litellm/issues/9161#issuecomment-2734905153
597afb73a6 | 2025-03-18 23:35:28 -07:00 | Ishaan Jaff | Merge pull request #9353 from BerriAI/litellm_arize_dynamic_logging
    [Feat] - API - Allow using dynamic Arize AI Spaces on LiteLLM
27ff9d3d30 | 2025-03-18 23:24:07 -07:00 | Krish Dholakia | Merge pull request #9335 from BerriAI/litellm_dev_03_17_2025_p3
    Contributor PR: Fix sagemaker too little data for content error