Commit graph

2426 commits

Author SHA1 Message Date
Ishaan Jaff
838ab59a84 return correct fields in NewUserResponse 2024-05-31 10:31:19 -07:00
Ishaan Jaff
f3687f68e7 fix use send_invite_email on /key/generate 2024-05-31 09:14:25 -07:00
Ishaan Jaff
6b7f5a92b6 fix - add send_invite_email to GenerateKeyRequest 2024-05-31 09:12:14 -07:00
Ishaan Jaff
1921e51b95 feat - send_invite_email on /user/new 2024-05-31 08:55:49 -07:00
Ishaan Jaff
f9862be049 Merge branch 'main' into litellm_invite_link_flow_2 2024-05-31 08:14:52 -07:00
Krish Dholakia
deb87f71e3 Merge pull request #3670 from lj-wego/fix-pydantic-warnings-again
Fix warnings from pydantic
2024-05-30 22:57:21 -07:00
Krrish Dholakia
e2b34165e7 feat(proxy_server.py): add assistants api endpoints to proxy server 2024-05-30 22:44:43 -07:00
Ishaan Jaff
aada7b4bd3 ui new build 2024-05-30 20:59:51 -07:00
Ishaan Jaff
9a55365791 fix - rate limit error dashboard 2024-05-30 20:56:05 -07:00
lj
1ff3afc1ae Fix class config deprecation warning 2024-05-31 11:37:57 +08:00
lj
27ed72405b Merge branch 'main' into fix-pydantic-warnings-again 2024-05-31 11:35:42 +08:00
Ishaan Jaff
4fe8da7cd3 feat - show global/activity/exception 2024-05-30 17:55:38 -07:00
Ishaan Jaff
eeaf4d83e1 ui - new build 2024-05-30 17:35:37 -07:00
Ishaan Jaff
6b97713f91 Merge pull request #3931 from BerriAI/litellm_ui_fix_editing_users
[Feat] UI - cleanup editing users
2024-05-30 17:33:08 -07:00
Krish Dholakia
d3a247bf20 Merge pull request #3928 from BerriAI/litellm_audio_speech_endpoint
feat(main.py): support openai tts endpoint
2024-05-30 17:30:42 -07:00
Krrish Dholakia
d65b7fe01b fix(main.py): add logging to audio_transcription calls 2024-05-30 16:57:11 -07:00
Ishaan Jaff
aa7d127901 fix view users table 2024-05-30 16:51:49 -07:00
Krrish Dholakia
93166cdabf fix(openai.py): fix openai response for /audio/speech endpoint 2024-05-30 16:41:06 -07:00
Ishaan Jaff
67377e357d use proxy_admin_viewer, and internal_user_viewer 2024-05-30 16:37:19 -07:00
Ishaan Jaff
bfb06d37b7 fix ui_get_available_role 2024-05-30 16:12:41 -07:00
Ishaan Jaff
a3b1493bab add doc string for enum 2024-05-30 16:10:18 -07:00
Ishaan Jaff
338fc82ce8 fix LitellmUserRoles 2024-05-30 15:35:33 -07:00
Ishaan Jaff
3d4cda3373 fix /user/new 2024-05-30 15:33:23 -07:00
Ishaan Jaff
b03f00062e add better details about LitellmUserRoles 2024-05-30 14:47:04 -07:00
Ishaan Jaff
0a3a37a196 fix - /user/new 2024-05-30 14:41:47 -07:00
Ishaan Jaff
a4b6a959d8 fix literal usage 2024-05-30 14:28:53 -07:00
Ishaan Jaff
4861ff2fd4 fix user_role test 2024-05-30 14:10:08 -07:00
Ishaan Jaff
6166381579 fix migrate to LitellmUserRoles 2024-05-30 13:57:50 -07:00
Ishaan Jaff
b29adeb6c6 feat - move to using enums for roles 2024-05-30 13:55:45 -07:00
Ishaan Jaff
fc54acc17a fix - use LitellmUserRoles 2024-05-30 13:51:58 -07:00
Krish Dholakia
61454eaa0f Merge pull request #3926 from BerriAI/litellm_404_error
fix(router.py): cooldown on 404 errors
2024-05-30 12:03:09 -07:00
Krrish Dholakia
32bfb685f5 fix(router.py): cooldown on 404 errors
https://github.com/BerriAI/litellm/issues/3884
2024-05-30 10:57:38 -07:00
Ishaan Jaff
9f189ac91c fix - verbose_proxy_logger debug 2024-05-30 08:09:31 -07:00
Krrish Dholakia
3167bee25a fix(proxy_cli.py): enable json logging via litellm_settings param on config
allows users to enable json logs without needing to figure out env variables
2024-05-29 21:41:20 -07:00
Ishaan Jaff
d451bfbf50 ui new build 2024-05-29 21:11:09 -07:00
Ishaan Jaff
c39db5686d ui - new build 2024-05-29 18:49:22 -07:00
Krish Dholakia
67973e5344 Merge pull request #3910 from BerriAI/litellm_fix_end_user_max_budget
fix(proxy_server.py): fix end user object check when master key used
2024-05-29 18:04:56 -07:00
Ishaan Jaff
b7fcec8835 fix batch tags 2024-05-29 17:51:39 -07:00
Ishaan Jaff
ea30769190 Merge pull request #3897 from BerriAI/litellm_fix_latency_cal
ui - fix latency analytics on `completion_tokens`
2024-05-29 17:32:17 -07:00
Ishaan Jaff
f9b98c57a2 Merge pull request #3894 from BerriAI/litellm_update_analytics_tab
[Feat] UI update analytics tab to show human friendly usage vals
2024-05-29 17:31:21 -07:00
Krrish Dholakia
cfcf5969c8 fix(proxy_server.py): fix end user object check when master key used
check if end user max budget exceeded for master key
2024-05-29 17:20:59 -07:00
Krish Dholakia
8556f4afb0 Merge pull request #3902 from BerriAI/litellm_openai_compatible_endpoint_fix
fix(openai.py): only allow 'user' as optional param if openai model
2024-05-29 16:34:23 -07:00
Krrish Dholakia
f729370890 feat(proxy_server.py): emit webhook event whenever customer spend is tracked
Closes https://github.com/BerriAI/litellm/issues/3903
2024-05-29 15:59:32 -07:00
Krrish Dholakia
3d32b00821 fix(openai.py): only allow 'user' as optional param if openai model 2024-05-29 15:15:02 -07:00
Krrish Dholakia
8f0019c241 docs(customers.md): add customer cost tracking to docs 2024-05-29 14:55:33 -07:00
Ishaan Jaff
02598ae988 fix(proxy_server.py): fix latency calc for avg output token 2024-05-29 09:49:45 -07:00
Ishaan Jaff
e252daaf2b ui - fix latency analytics completion_tokens 2024-05-29 09:39:53 -07:00
Ishaan Jaff
5958288ebb fix /global/activity/model 2024-05-29 08:58:25 -07:00
Krish Dholakia
e838bd1c79 Merge branch 'main' into litellm_batch_completions 2024-05-28 22:38:05 -07:00
Ishaan Jaff
6e9cc26ca1 Merge pull request #3888 from BerriAI/litellm_add_files_proxy
[Feat] LiteLLM Proxy Add `POST /v1/files` and `GET /v1/files`
2024-05-28 22:36:33 -07:00