6a0cf3db50 | Krrish Dholakia | 2025-03-24 13:45:39 -07:00
fix(litellm_logging.py): always log the api base
Fixes issue where api base missing from spend logs due to refactor

7864cd1f76 | Tyler Hutcherson | 2025-03-24 08:42:11 -04:00
update redisvl dependency

6f7d618918 | Ishaan Jaff | 2025-03-22 19:47:13 -07:00
test tool call cost tracking

f21a0c2da7 | Ishaan Jaff | 2025-03-22 18:51:25 -07:00
Merge branch 'main' into litellm_exp_mcp_server

d3baaf7961 | Krish Dholakia | 2025-03-22 14:11:57 -07:00
Merge pull request #9467 from BerriAI/litellm_dev_03_22_2025_p1
Refactor vertex ai passthrough routes - fixes unpredictable behaviour w/ auto-setting default_vertex_region on router model add

3ce3689282 | Krrish Dholakia | 2025-03-22 12:48:53 -07:00
test: migrate testing

92d4486a2c | Krrish Dholakia | 2025-03-22 11:49:51 -07:00
fix(llm_passthrough_endpoints.py): raise verbose error if credentials not found on proxy

792a2d6115 | Ishaan Jaff | 2025-03-22 11:41:53 -07:00
test_is_chunk_non_empty_with_annotations

be72ecc23f | Krrish Dholakia | 2025-03-22 11:35:57 -07:00
test: add more e2e testing

06e69a414e | Krrish Dholakia | 2025-03-22 11:32:01 -07:00
fix(vertex_ai/common_utils.py): fix handling constructed url with default vertex config

b44b3bd36b | Krrish Dholakia | 2025-03-22 11:06:52 -07:00
feat(llm_passthrough_endpoints.py): base case passing for refactored vertex passthrough route

94d3413335 | Krrish Dholakia | 2025-03-22 10:42:46 -07:00
refactor(llm_passthrough_endpoints.py): refactor vertex passthrough to use common llm passthrough handler.py

950edd76b3 | Krish Dholakia | 2025-03-21 22:32:46 -07:00
Merge pull request #9454 from BerriAI/litellm_dev_03_21_2025_p3
Fix route check for non-proxy admins on jwt auth

ed74b419a3 | Ishaan Jaff | 2025-03-21 20:42:16 -07:00
Merge pull request #9436 from BerriAI/litellm_mcp_interface
[Feat] LiteLLM x MCP Bridge - Use MCP Tools with LiteLLM

7b5c0de978 | Ishaan Jaff | 2025-03-21 18:38:24 -07:00
test_tools.py

881ac23964 | Ishaan Jaff | 2025-03-21 18:24:43 -07:00
test_transform_openai_tool_call_to_mcp_tool_call_request tests

1ebdeb852c | Krrish Dholakia | 2025-03-21 18:06:20 -07:00
test(test_internal_user_endpoints.py): add unit testing to handle user_email=None

dfb41c927e | Krish Dholakia | 2025-03-21 17:51:46 -07:00
Merge pull request #9448 from BerriAI/litellm_dev_03_21_2025_p2
Set max size limit to in-memory cache item - prevents OOM errors

c7b17495a1 | Krrish Dholakia | 2025-03-21 15:01:19 -07:00
test: add unit testing

dfea55a1e7 | Krrish Dholakia | 2025-03-21 14:51:12 -07:00
fix(in_memory_cache.py): add max value limits to in-memory cache. Prevents OOM errors in prod

81a1494a51 | Krrish Dholakia | 2025-03-21 10:35:36 -07:00
test: add unit testing

5bc07b0c5d | Ishaan Jaff | 2025-03-20 22:03:56 -07:00
test tool registry

c44fe8bd90 | Ishaan Jaff | 2025-03-20 21:54:43 -07:00
Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro
[Feat] OpenAI o1-pro Responses API streaming support

15048de5e2 | Ishaan Jaff | 2025-03-20 14:50:00 -07:00
test_prepare_fake_stream_request

46d68a61c8 | Krrish Dholakia | 2025-03-20 14:37:58 -07:00
fix: fix testing

1bd7443c25 | Ishaan Jaff | 2025-03-19 21:06:41 -07:00
Merge pull request #9384 from BerriAI/litellm_prompt_management_custom
[Feat] - Allow building custom prompt management integration

247e4d09ee | Ishaan Jaff | 2025-03-19 21:03:06 -07:00
Merge branch 'main' into litellm_fix_ssl_verify

30fdd934a4 | Ishaan Jaff | 2025-03-19 17:40:15 -07:00
TestCustomPromptManagement

9432d1a865 | Krish Dholakia | 2025-03-19 15:45:10 -07:00
Merge pull request #9357 from BerriAI/litellm_dev_03_18_2025_p2
fix(lowest_tpm_rpm_v2.py): support batch writing increments to redis

041d5391eb | Krrish Dholakia | 2025-03-19 12:01:37 -07:00
test(test_proxy_server.py): make test work on ci/cd

858da57b3c | Krrish Dholakia | 2025-03-19 11:44:00 -07:00
test(test_proxy_server.py): add unit test to ensure get credentials only called behind feature flag

9adad381b4 | Krrish Dholakia | 2025-03-18 23:35:43 -07:00
fix(common_utils.py): handle cris only model
Fixes https://github.com/BerriAI/litellm/issues/9161#issuecomment-2734905153

084e8c425c | Krrish Dholakia | 2025-03-18 22:41:02 -07:00
refactor(base_routing_strategy.py): fix function names

3033c40739 | Krrish Dholakia | 2025-03-18 22:20:39 -07:00
fix(base_routing_strategy.py): fix base to handle no running event loop
run in a separate thread

a3d000baaa | Krrish Dholakia | 2025-03-18 19:59:06 -07:00
fix(test_base_routing_strategy.py): add unit testing for new base routing strategy test

65083ca8da | Ishaan Jaff | 2025-03-18 18:35:50 -07:00
get_openai_client_cache_key

40418c7bd8 | Ishaan Jaff | 2025-03-18 18:13:36 -07:00
test_openai_client_reuse

d5150e000d | Ishaan Jaff | 2025-03-18 18:08:24 -07:00
test openai common utils

55ea2370ba | Ishaan Jaff | 2025-03-18 14:23:14 -07:00
Union[TranscriptionResponse, Coroutine[Any, Any, TranscriptionResponse]]:

6987a73e36 | Ishaan Jaff | 2025-03-18 12:27:17 -07:00
initialize_azure_sdk_client

00b338cb9c | Krrish Dholakia | 2025-03-18 12:24:12 -07:00
test: refactor testing to handle routing correctly

d42f84dd4e | Ishaan Jaff | 2025-03-17 22:07:47 -07:00
Merge pull request #9328 from BerriAI/litellm_control_internal_user_settings_ui
[UI] - Allow controlling default internal user settings on ui

cd5024f3b1 | Krish Dholakia | 2025-03-17 21:48:30 -07:00
Merge pull request #9333 from BerriAI/litellm_dev_03_17_2025_p2
fix(ollama/completions/transformation.py): pass prompt, untemplated o…

5400615ce8 | Ishaan Jaff | 2025-03-17 21:46:08 -07:00
Merge pull request #9329 from BerriAI/litellm_fix_reset_budget_job
[Bug fix] Reset Budget Job

27768a4866 | Krish Dholakia | 2025-03-17 19:57:25 -07:00
Merge pull request #9330 from BerriAI/litellm_dev_03_17_2025_p1
Litellm dev 03 17 2025 p1

fbace8d041 | Ishaan Jaff | 2025-03-17 19:54:51 -07:00
unit test reset budget job

22faf7d232 | Krrish Dholakia | 2025-03-17 18:35:44 -07:00
fix(ollama/completions/transformation.py): pass prompt, untemplated on /completions request
Fixes https://github.com/BerriAI/litellm/issues/6900

b4568762d2 | Ishaan Jaff | 2025-03-17 18:26:42 -07:00
update_internal_user_settings

8e27b2026a | Krrish Dholakia | 2025-03-17 15:48:31 -07:00
fix(http_handler.py): support reading ssl security level from env var
Allows user to specify lower security settings

078e2d341b | Krrish Dholakia | 2025-03-17 15:12:01 -07:00
feat(cost_calculator.py): support reading litellm response cost header in client sdk
allows consistent cost tracking when sdk is calling proxy