Commit graph

2318 commits

Author SHA1 Message Date
Krish Dholakia
85f8907e0d Merge pull request #3670 from lj-wego/fix-pydantic-warnings-again
Fix warnings from pydantic
2024-05-30 22:57:21 -07:00
Ishaan Jaff
8bcb53137e ui new build 2024-05-30 20:59:51 -07:00
Ishaan Jaff
4ddfc1b98b fix - rate limit error dashboard 2024-05-30 20:56:05 -07:00
lj
60e2f372ba Fix class config deprecation warning 2024-05-31 11:37:57 +08:00
lj
f1fe41db74 Merge branch 'main' into fix-pydantic-warnings-again 2024-05-31 11:35:42 +08:00
Ishaan Jaff
e981f342bc feat - show global/activity/exception 2024-05-30 17:55:38 -07:00
Ishaan Jaff
4daafab3df ui - new build 2024-05-30 17:35:37 -07:00
Ishaan Jaff
6e4c56ef08 Merge pull request #3931 from BerriAI/litellm_ui_fix_editing_users
[Feat] UI - cleanup editing users
2024-05-30 17:33:08 -07:00
Krish Dholakia
73e3dba2f6 Merge pull request #3928 from BerriAI/litellm_audio_speech_endpoint
feat(main.py): support openai tts endpoint
2024-05-30 17:30:42 -07:00
Krrish Dholakia
6b4153ff03 fix(main.py): add logging to audio_transcription calls 2024-05-30 16:57:11 -07:00
Ishaan Jaff
4662264552 fix view users table 2024-05-30 16:51:49 -07:00
Krrish Dholakia
eb159b64e1 fix(openai.py): fix openai response for /audio/speech endpoint 2024-05-30 16:41:06 -07:00
Ishaan Jaff
4ba5e2eb87 use proxy_admin_viewer, and internal_user_viewer 2024-05-30 16:37:19 -07:00
Ishaan Jaff
b58c8cd253 fix ui_get_available_role 2024-05-30 16:12:41 -07:00
Ishaan Jaff
5f07bb7c3b add doc string for enum 2024-05-30 16:10:18 -07:00
Ishaan Jaff
7f9f0360a9 fix LitellmUserRoles 2024-05-30 15:35:33 -07:00
Ishaan Jaff
eb8cc5a21e fix /user/new 2024-05-30 15:33:23 -07:00
Ishaan Jaff
6eb8186740 add better details about LitellmUserRoles 2024-05-30 14:47:04 -07:00
Ishaan Jaff
7a4e8f761a fix - /user/new 2024-05-30 14:41:47 -07:00
Ishaan Jaff
b7dabe9bed fix literal usage 2024-05-30 14:28:53 -07:00
Ishaan Jaff
89bb364775 fix user_role test 2024-05-30 14:10:08 -07:00
Ishaan Jaff
dc51fec22e fix migrate to LitellmUserRoles 2024-05-30 13:57:50 -07:00
Ishaan Jaff
ff33241ce2 feat - move to using enums for roles 2024-05-30 13:55:45 -07:00
Ishaan Jaff
78a6fb0c13 fix - use LitellmUserRoles 2024-05-30 13:51:58 -07:00
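The run of commits above (ff33241ce2 through 78a6fb0c13) migrates hard-coded role strings to a `LitellmUserRoles` enum. A minimal sketch of that pattern, using the two role names mentioned in commit 4ba5e2eb87 — the member set and docstrings here are illustrative assumptions, not the actual litellm definition:

```python
from enum import Enum


class LitellmUserRoles(str, Enum):
    """Sketch of a string-valued roles enum.

    Subclassing `str` means members compare equal to the legacy
    role strings, so existing comparisons keep working during the
    migration described in commit dc51fec22e.
    """

    PROXY_ADMIN_VIEWER = "proxy_admin_viewer"
    INTERNAL_USER_VIEWER = "internal_user_viewer"


# legacy string comparisons still hold
assert LitellmUserRoles.PROXY_ADMIN_VIEWER == "proxy_admin_viewer"
# and round-tripping from a stored string recovers the member
assert LitellmUserRoles("internal_user_viewer") is LitellmUserRoles.INTERNAL_USER_VIEWER
```

The `str` mixin is what makes this migration incremental: callers that still pass or compare raw strings (the "fix literal usage" commit b7dabe9bed) do not break while call sites are converted one by one.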
Krish Dholakia
29660c9748 Merge pull request #3926 from BerriAI/litellm_404_error
fix(router.py): cooldown on 404 errors
2024-05-30 12:03:09 -07:00
Krrish Dholakia
66e08cac9b fix(router.py): cooldown on 404 errors
https://github.com/BerriAI/litellm/issues/3884
2024-05-30 10:57:38 -07:00
Ishaan Jaff
0dd84b66c2 fix - verbose_proxy_logger debug 2024-05-30 08:09:31 -07:00
Krrish Dholakia
741bfb9cef fix(proxy_cli.py): enable json logging via litellm_settings param on config
allows user to enable json logs without needing to figure out env variables
2024-05-29 21:41:20 -07:00
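Commit 741bfb9cef wires JSON logging through the proxy config so users no longer need environment variables. A plausible `config.yaml` fragment — the exact key name is an assumption inferred from the commit message:

```yaml
# hypothetical config.yaml fragment; key name inferred from commit 741bfb9cef
litellm_settings:
  json_logs: true   # emit structured JSON logs instead of plain text
```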
Ishaan Jaff
ab416fe098 ui new build 2024-05-29 21:11:09 -07:00
Ishaan Jaff
8c0fd010d6 ui - new build 2024-05-29 18:49:22 -07:00
Krish Dholakia
7ef23e6fe9 Merge pull request #3910 from BerriAI/litellm_fix_end_user_max_budget
fix(proxy_server.py): fix end user object check when master key used
2024-05-29 18:04:56 -07:00
Ishaan Jaff
60d62c06a5 fix batch tags 2024-05-29 17:51:39 -07:00
Ishaan Jaff
2c35d2ee95 Merge pull request #3897 from BerriAI/litellm_fix_latency_cal
ui - fix latency analytics on `completion_tokens`
2024-05-29 17:32:17 -07:00
Ishaan Jaff
88e13b8f8b Merge pull request #3894 from BerriAI/litellm_update_analytics_tab
[Feat] UI update analytics tab to show human friendly usage vals
2024-05-29 17:31:21 -07:00
Krrish Dholakia
bb3c786194 fix(proxy_server.py): fix end user object check when master key used
check if end user max budget exceeded for master key
2024-05-29 17:20:59 -07:00
Krish Dholakia
6da5cee4d0 Merge pull request #3902 from BerriAI/litellm_openai_compatible_endpoint_fix
fix(openai.py): only allow 'user' as optional param if openai model
2024-05-29 16:34:23 -07:00
Krrish Dholakia
df0365175a feat(proxy_server.py): emit webhook event whenever customer spend is tracked
Closes https://github.com/BerriAI/litellm/issues/3903
2024-05-29 15:59:32 -07:00
Krrish Dholakia
915e599772 fix(openai.py): only allow 'user' as optional param if openai model 2024-05-29 15:15:02 -07:00
Krrish Dholakia
a4dae8e9f1 docs(customers.md): add customer cost tracking to docs 2024-05-29 14:55:33 -07:00
Ishaan Jaff
0521fec230 fix(proxy_server.py): fix latency calc for avg output token 2024-05-29 09:49:45 -07:00
Ishaan Jaff
467e6dd7dd ui - fix latency analytics completion_tokens 2024-05-29 09:39:53 -07:00
Ishaan Jaff
41604b53a8 fix /global/activity/model 2024-05-29 08:58:25 -07:00
Krish Dholakia
4fd3994b4e Merge branch 'main' into litellm_batch_completions 2024-05-28 22:38:05 -07:00
Ishaan Jaff
d8245cbccb Merge pull request #3888 from BerriAI/litellm_add_files_proxy
[Feat] LiteLLM Proxy Add `POST /v1/files` and `GET /v1/files`
2024-05-28 22:36:33 -07:00
Krrish Dholakia
012bde0b07 fix(router.py): support comma-separated model list for batch completion fastest response 2024-05-28 21:34:37 -07:00
Krrish Dholakia
792b25c772 feat(proxy_server.py): enable batch completion fastest response calls on proxy
introduces new `fastest_response` flag for enabling the call
2024-05-28 20:09:31 -07:00
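Commits 792b25c772 and 012bde0b07 describe a `fastest_response` flag that fans a request out to a comma-separated list of models and returns the first completion. A hypothetical proxy config fragment showing how such a flag might be wired up — the key placement and model names here are assumptions from the commit messages, not documented litellm syntax:

```yaml
# hypothetical sketch; shape inferred from commits 792b25c772 / 012bde0b07
model_list:
  - model_name: fastest-model
    litellm_params:
      # comma-separated candidates raced against each other
      model: "gpt-3.5-turbo, gpt-4"

litellm_settings:
  fastest_response: true   # return the first model to finish
```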
Ishaan Jaff
18830e58e9 feat - add file endpoints proxy 2024-05-28 20:03:13 -07:00
Krrish Dholakia
4dd7b05406 feat(proxy_server.py): give request-level breakdown if ttft metric is selected for just that day
2024-05-28 18:09:22 -07:00
Ishaan Jaff
ea609a0b14 feat - support GET /v1/batches/{batch_id} 2024-05-28 17:58:34 -07:00
Ishaan Jaff
e14e5e8f8e feat add v1/batches 2024-05-28 17:49:36 -07:00