Ishaan Jaff | a9123d961d | fix _reset_budget_for_team | 2025-03-17 19:34:44 -07:00
Ishaan Jaff | 91f4d4d865 | fix _reset_budget_for_key/team/user | 2025-03-17 18:56:24 -07:00
Krrish Dholakia | 22faf7d232 | fix(ollama/completions/transformation.py): pass prompt, untemplated on /completions request (Fixes https://github.com/BerriAI/litellm/issues/6900) | 2025-03-17 18:35:44 -07:00
Ishaan Jaff | 8441f2f55c | fix DefaultInternalUserParams | 2025-03-17 18:13:15 -07:00
Ishaan Jaff | 93e4a36300 | DefaultInternalUserParams | 2025-03-17 18:11:19 -07:00
Krrish Dholakia | c4b2e0ae3d | fix(streaming_handler.py): support logging complete streaming response on cache hit | 2025-03-17 18:10:39 -07:00
Ishaan Jaff | 5d8b288570 | fix get_sso_settings | 2025-03-17 18:06:09 -07:00
Ishaan Jaff | 5e1e25d98d | is_internal_user_role | 2025-03-17 17:55:35 -07:00
Ishaan Jaff | 8ff8ad82c9 | DefaultInternalUserParams | 2025-03-17 17:34:05 -07:00
Krrish Dholakia | dd9e79adbd | fix(streaming_handler.py): emit deep copy of completed chunk | 2025-03-17 17:26:21 -07:00
Emerson Gomes | 18c2592f2b | fix redis serialization issue with Redis + lowest latency strategy | 2025-03-17 19:19:20 -05:00
Andrew Smith | 1b2439cb82 | Merge branch 'BerriAI:main' into main | 2025-03-18 11:10:02 +11:00
Andrew Smith | 81a7cf0f44 | Update handler.py to use prepared_request.body for input | 2025-03-18 11:07:38 +11:00
Ishaan Jaff | 508b474a1f | update_internal_user_settings | 2025-03-17 17:03:06 -07:00
Ishaan Jaff | 02db6f9d86 | fix display of settings | 2025-03-17 16:56:49 -07:00
Krrish Dholakia | 057c774c14 | fix(http_handler.py): fix typing error | 2025-03-17 16:42:32 -07:00
Ishaan Jaff | 2607676e3f | UISSOSettings | 2025-03-17 16:29:32 -07:00
Andrew Smith | a92e99e946 | Update handler.py to use prepared_request.body | 2025-03-18 10:23:32 +11:00
Ishaan Jaff | 0af0de8a96 | model_dump() | 2025-03-17 15:53:36 -07:00
Ishaan Jaff | 401630833f | add types for internal user settings | 2025-03-17 15:52:01 -07:00
Krrish Dholakia | 8e27b2026a | fix(http_handler.py): support reading ssl security level from env var (allows user to specify lower security settings) | 2025-03-17 15:48:31 -07:00
Krrish Dholakia | 078e2d341b | feat(cost_calculator.py): support reading litellm response cost header in client sdk (allows consistent cost tracking when sdk is calling proxy) | 2025-03-17 15:12:01 -07:00
Ishaan Jaff | ce9ab69be2 | add UISSOSettings | 2025-03-17 15:09:54 -07:00
Krrish Dholakia | db92956ae3 | fix(redis_cache.py): add 5s default timeout | 2025-03-17 14:27:36 -07:00
Krish Dholakia | d0d8ec2c40 | Merge branch 'main' into litellm_dev_03_16_2025_p1 | 2025-03-17 10:02:53 -07:00
Krrish Dholakia | a5b497667c | fix(logging_utils.py): revert change | 2025-03-16 21:04:41 -07:00
Krrish Dholakia | a99251a4ab | fix(streaming_handler.py): raise stop iteration post-finish reason | 2025-03-16 20:40:41 -07:00
Krrish Dholakia | bde9ae8a95 | fix(litellm_logging.py): remove unused import | 2025-03-16 20:24:27 -07:00
Krrish Dholakia | c0a76427d2 | fix(streaming_handler.py): pass complete streaming response on completion | 2025-03-16 20:22:12 -07:00
Krrish Dholakia | 08b297230e | fix(streaming_handler.py): return model response on finished chunk | 2025-03-16 13:05:46 -07:00
Krrish Dholakia | b093157369 | fix(converse_transformation.py): fix linting error | 2025-03-15 19:33:17 -07:00
Krrish Dholakia | 5dc46f0cf7 | fix(converse_transformation.py): fix encoding model | 2025-03-15 14:03:37 -07:00
Krrish Dholakia | 612d5a284d | refactor(litellm_logging.py): delegate returning a complete response to the streaming_handler (removes incorrect logic for calculating complete streaming response from litellm logging) | 2025-03-15 09:55:33 -07:00
Krrish Dholakia | dd2c980d5b | fix(utils.py): prevent final chunk w/ usage from being ignored (Fixes https://github.com/BerriAI/litellm/issues/7112) | 2025-03-15 09:12:14 -07:00
Krish Dholakia | d4caaae1be | Merge pull request #9274 from BerriAI/litellm_contributor_rebase_branch (Litellm contributor rebase branch) | 2025-03-14 21:57:49 -07:00
Ishaan Jaff | a256108f3f | ui new build | 2025-03-14 21:46:40 -07:00
Ishaan Jaff | 9dc962511f | ui new build | 2025-03-14 21:29:42 -07:00
Ishaan Jaff | 5290d141b7 | ui new build | 2025-03-14 21:23:13 -07:00
Ishaan Jaff | 12188f286c | Merge pull request #9258 from BerriAI/litellm_fix_models_view_edit ((UI) Fix model edit + delete - instantly show edits + deletes to models) | 2025-03-14 21:21:46 -07:00
Ishaan Jaff | bda5fe0fcf | Merge pull request #9272 from BerriAI/litellm_add_test_connection_button ([Feat] UI - Add Test Connection) | 2025-03-14 21:16:44 -07:00
Krrish Dholakia | a9dceacc1b | fix(factory.py): reduce ollama pt LOC < 50 | 2025-03-14 21:10:05 -07:00
Ishaan Jaff | d7e10fee79 | fix code quality | 2025-03-14 21:06:28 -07:00
Ishaan Jaff | cbf0fa44b4 | undo changes to route llm request | 2025-03-14 21:05:51 -07:00
Krish Dholakia | 82f31beee5 | Merge pull request #9271 from BerriAI/litellm_rc_03_14_2025_patch_1 (Litellm rc 03 14 2025 patch 1) | 2025-03-14 20:57:22 -07:00
Krrish Dholakia | b15c06ee94 | fix(team_endpoints.py): fix linting error | 2025-03-14 20:51:21 -07:00
Krish Dholakia | 59fd58643b | Merge pull request #9261 from briandevvn/fix_ollama_pt (Fix "system" role has become unacceptable in ollama) | 2025-03-14 20:13:28 -07:00
Krrish Dholakia | 26226d475f | feat(proxy_server.py): support retrieving models for a team, if user is a member, via /models?team_id (allows user to see team models on UI when creating a key) | 2025-03-14 19:34:06 -07:00
Ishaan Jaff | 6787d0dabe | test_model_connection | 2025-03-14 18:33:49 -07:00
Krrish Dholakia | 621d193727 | build: new ui build | 2025-03-14 18:18:03 -07:00
Ishaan Jaff | 5a6da56058 | fix endpoint_data | 2025-03-14 17:21:01 -07:00