| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Krrish Dholakia | a93471a3c7 | feat(team_endpoints.py): unfurl 'all-proxy-models' on team info endpoints | 2025-03-14 16:54:24 -07:00 |
| Krrish Dholakia | 08abee1990 | fix: fix linting error | 2025-03-14 14:17:28 -07:00 |
| Krrish Dholakia | f089b1e23f | feat(endpoints.py): support adding credentials by model id<br>Allows user to reuse existing model credentials | 2025-03-14 12:32:32 -07:00 |
| Ishaan Jaff | d67fc03e20 | fix endpoint | 2025-03-14 12:26:17 -07:00 |
| Krrish Dholakia | 605a4d1121 | feat(endpoints.py): enable retrieving existing credentials by model name<br>Enables reusing existing credentials | 2025-03-14 12:02:50 -07:00 |
| Ishaan Jaff | c84717c9e0 | add health/test_connection | 2025-03-14 11:30:24 -07:00 |
| Ishaan Jaff | 6742d3cb10 | fix route llm request to allow non-router models | 2025-03-14 11:10:10 -07:00 |
| Brian Dev | 12db28b0af | Support 'system' role ollama | 2025-03-15 00:55:18 +07:00 |
| Krrish Dholakia | 6629354329 | fix(endpoints.py): update credentials should update before storing | 2025-03-14 10:42:17 -07:00 |
| Krrish Dholakia | fc0d21794d | style: cleanup credential from leftnav - now in models tab | 2025-03-14 10:14:21 -07:00 |
| Ishaan Jaff | c36e5cae50 | backend instant delete model | 2025-03-14 10:12:50 -07:00 |
| Lucas Raschek | 56d3e75b33 | Map total tokens to prompt_tokens too | 2025-03-14 18:04:43 +01:00 |
| Ishaan Jaff | c2d290785a | ui new build | 2025-03-13 21:44:32 -07:00 |
| Ishaan Jaff | 276a7089df | Merge pull request #9220 from BerriAI/litellm_qa_responses_api<br>[Fixes] Responses API - allow /responses and subpaths as LLM API route + Add exception mapping for responses API | 2025-03-13 21:36:59 -07:00 |
| Ishaan Jaff | 241a36a74f | Merge pull request #9222 from BerriAI/litellm_snowflake_pr_mar_13<br>[Feat] Add Snowflake Cortex to LiteLLM | 2025-03-13 21:35:39 -07:00 |
| Krish Dholakia | f89dbb8ab3 | Merge branch 'main' into litellm_dev_03_13_2025_p3 | 2025-03-13 20:12:16 -07:00 |
| Ishaan Jaff | 69b47cf738 | fix code quality check | 2025-03-13 20:10:41 -07:00 |
| Ishaan Jaff | 7827c275ba | exception_type | 2025-03-13 20:09:32 -07:00 |
| Krish Dholakia | e8c67f25e3 | Merge pull request #9221 from BerriAI/litellm_dev_03_13_2025_p2<br>Support bedrock converse cache token tracking | 2025-03-13 20:08:33 -07:00 |
| Krish Dholakia | fd8a5960ec | Merge pull request #9216 from BerriAI/litellm_dev_03_12_2025_contributor_prs_p2<br>Litellm dev 03 12 2025 contributor prs p2 | 2025-03-13 20:03:57 -07:00 |
| Krrish Dholakia | 58a7351a73 | fix: fix linting errors | 2025-03-13 19:40:18 -07:00 |
| Krrish Dholakia | 8a6e4715aa | feat(converse_transformation.py): fix type for bedrock cache usage block | 2025-03-13 19:33:22 -07:00 |
| Krrish Dholakia | f2d0aaacbc | fix: fix linting errors | 2025-03-13 19:26:46 -07:00 |
| Krrish Dholakia | dc3b02920f | feat(model_management_endpoints.py): support audit logs on /model/add and /model/update endpoints<br>complete CUD endpoint audit logging on models + users | 2025-03-13 19:17:40 -07:00 |
| Krrish Dholakia | 9145e8db77 | feat: fix linting errors | 2025-03-13 19:00:27 -07:00 |
| Krrish Dholakia | 37b30395c9 | feat(model_management_endpoints.py): emit audit logs on model delete | 2025-03-13 18:48:38 -07:00 |
| Krrish Dholakia | e90b3d9c4c | feat(internal_user_endpoints.py): add audit logs on /user/update | 2025-03-13 18:17:05 -07:00 |
| Ishaan Jaff | d3781dfe36 | fix linting errors | 2025-03-13 16:58:34 -07:00 |
| Krrish Dholakia | 5cfae0e98a | feat(internal_user_endpoints.py): emit audit log on /user/new event | 2025-03-13 16:47:58 -07:00 |
| Ishaan Jaff | d7847feba9 | Add stubbed routes to pass initial auth tests | 2025-03-13 16:43:25 -07:00 |
| Sunny Wan | c942f4cd86 | Merge branch 'main' of https://github.com/SunnyWan59/litellm | 2025-03-13 19:42:25 -04:00 |
| Sunny Wan | 70770b6aa4 | Removed unnecessary code and refactored | 2025-03-13 19:42:10 -04:00 |
| Sunny Wan | f9a5109203 | Merge branch 'BerriAI:main' into main | 2025-03-13 19:37:22 -04:00 |
| Krrish Dholakia | 0af6cde994 | fix(invoke_handler.py): support cache token tracking on converse streaming | 2025-03-13 16:10:13 -07:00 |
| Ishaan Jaff | 15d618f5b1 | Add exception mapping for responses API | 2025-03-13 15:57:58 -07:00 |
| Krrish Dholakia | f99b1937db | feat(converse_transformation.py): translate converse usage block with cache creation values to openai format | 2025-03-13 15:49:25 -07:00 |
| Ishaan Jaff | 1ee6b7852f | fix exception_type | 2025-03-13 15:33:17 -07:00 |
| Krrish Dholakia | 997f2f0b3e | fix(aim.py): fix linting error | 2025-03-13 15:32:42 -07:00 |
| Krrish Dholakia | f17bc60593 | test: patch test to avoid lakera changes to sensitivity | 2025-03-13 15:18:08 -07:00 |
| Ishaan Jaff | 2543e3a902 | fix auth add responses API to llm routes | 2025-03-13 15:13:07 -07:00 |
| Krrish Dholakia | 5ffd3f56f8 | fix(azure.py): track azure llm api latency metric | 2025-03-13 14:47:35 -07:00 |
| Krrish Dholakia | 1cd57e95aa | fix: fix linting error | 2025-03-13 14:33:19 -07:00 |
| Krrish Dholakia | 86ed6be85e | fix: fix learnlm test | 2025-03-13 10:54:09 -07:00 |
| Krish Dholakia | a2414d09c1 | Merge pull request #8356 from hakasecurity/feature/aim-post-call<br>Aim Security post-call guardrails support | 2025-03-13 10:48:42 -07:00 |
| Krish Dholakia | 2c011d9a93 | Merge pull request #9123 from omrishiv/8911-fix-model-encoding<br>Fixes bedrock modelId encoding for Inference Profiles | 2025-03-13 10:42:32 -07:00 |
| Tomer Bin | 6ebf11e4b3 | ruff format | 2025-03-13 08:58:29 +02:00 |
| Tomer Bin | 32dedbe551 | CR | 2025-03-13 08:54:00 +02:00 |
| Tomer Bin | 4a31b32a88 | Support post-call guards for stream and non-stream responses | 2025-03-13 08:53:54 +02:00 |
| Krrish Dholakia | be35c9a663 | docs(reasoning_content.md): clarify 'thinking' param support is from v1.63.0+<br>Fixes https://github.com/BerriAI/litellm/issues/9171 | 2025-03-12 22:30:57 -07:00 |
| Krish Dholakia | 58e5f3e0c9 | Merge pull request #9193 from youngchannelforyou/feat/gemini_response_status_code<br>(gemini)Handle HTTP 201 status code in Vertex AI response | 2025-03-12 22:24:01 -07:00 |