Commit graph

1676 commits

Author SHA1 Message Date
Ishaan Jaff
9b92ad96a6 fix - invite users with invite links 2024-05-31 17:39:58 -07:00
Ishaan Jaff
3a334c8898
Merge branch 'main' into litellm_ui_invite_links 2024-05-31 17:30:25 -07:00
Ishaan Jaff
190ba2a3b9 fix allow PROXY_ADMIN_VIEW_ONLY to create an account 2024-05-31 17:29:17 -07:00
Ishaan Jaff
a049a52fde
Merge pull request #3942 from BerriAI/litellm_send_email_user_new
feat - `send_invite_email` on /user/new
2024-05-31 17:28:32 -07:00
Ishaan Jaff
dd64ae2ec1 feat - onboarding/get_token 2024-05-31 16:20:10 -07:00
Ishaan Jaff
dd8bafb647 fix - when general_settings is None 2024-05-31 13:19:06 -07:00
Ishaan Jaff
0ed2099d71 fix test key gen prisma 2024-05-31 10:57:58 -07:00
Ishaan Jaff
838ab59a84 return correct fields in NewUserResponse 2024-05-31 10:31:19 -07:00
Ishaan Jaff
f3687f68e7 fix use send_invite_email on /key/generate 2024-05-31 09:14:25 -07:00
Ishaan Jaff
1921e51b95 feat - send_invite_email on /user/new 2024-05-31 08:55:49 -07:00
Ishaan Jaff
f9862be049
Merge branch 'main' into litellm_invite_link_flow_2 2024-05-31 08:14:52 -07:00
Krrish Dholakia
e2b34165e7 feat(proxy_server.py): add assistants api endpoints to proxy server 2024-05-30 22:44:43 -07:00
Ishaan Jaff
9a55365791 fix - rate limit error dashboard 2024-05-30 20:56:05 -07:00
Ishaan Jaff
4fe8da7cd3 feat - show global/activity/exception 2024-05-30 17:55:38 -07:00
Krish Dholakia
d3a247bf20
Merge pull request #3928 from BerriAI/litellm_audio_speech_endpoint
feat(main.py): support openai tts endpoint
2024-05-30 17:30:42 -07:00
Krrish Dholakia
d65b7fe01b fix(main.py): add logging to audio_transcription calls 2024-05-30 16:57:11 -07:00
Krrish Dholakia
93166cdabf fix(openai.py): fix openai response for /audio/speech endpoint 2024-05-30 16:41:06 -07:00
Ishaan Jaff
bfb06d37b7 fix ui_get_available_role 2024-05-30 16:12:41 -07:00
Ishaan Jaff
3d4cda3373 fix /user/new 2024-05-30 15:33:23 -07:00
Ishaan Jaff
a4b6a959d8 fix literal usage 2024-05-30 14:28:53 -07:00
Ishaan Jaff
4861ff2fd4 fix user_role test 2024-05-30 14:10:08 -07:00
Ishaan Jaff
b29adeb6c6 feat - move to using enums for roles 2024-05-30 13:55:45 -07:00
Ishaan Jaff
9f189ac91c fix - verbose_proxy_logger debug 2024-05-30 08:09:31 -07:00
Krish Dholakia
67973e5344
Merge pull request #3910 from BerriAI/litellm_fix_end_user_max_budget
fix(proxy_server.py): fix end user object check when master key used
2024-05-29 18:04:56 -07:00
Ishaan Jaff
b7fcec8835 fix batch tags 2024-05-29 17:51:39 -07:00
Ishaan Jaff
ea30769190
Merge pull request #3897 from BerriAI/litellm_fix_latency_cal
ui - fix latency analytics on `completion_tokens`
2024-05-29 17:32:17 -07:00
Ishaan Jaff
f9b98c57a2
Merge pull request #3894 from BerriAI/litellm_update_analytics_tab
[Feat] UI update analytics tab to show human friendly usage vals
2024-05-29 17:31:21 -07:00
Krrish Dholakia
cfcf5969c8 fix(proxy_server.py): fix end user object check when master key used
check if end user max budget exceeded for master key
2024-05-29 17:20:59 -07:00
Krrish Dholakia
f729370890 feat(proxy_server.py): emit webhook event whenever customer spend is tracked
Closes https://github.com/BerriAI/litellm/issues/3903
2024-05-29 15:59:32 -07:00
Krrish Dholakia
8f0019c241 docs(customers.md): add customer cost tracking to docs 2024-05-29 14:55:33 -07:00
Ishaan Jaff
02598ae988 fix(proxy_server.py): fix latency calc for avg output token 2024-05-29 09:49:45 -07:00
Ishaan Jaff
e252daaf2b ui - fix latency analytics completion_tokens 2024-05-29 09:39:53 -07:00
Ishaan Jaff
5958288ebb fix /global/activity/model 2024-05-29 08:58:25 -07:00
Krish Dholakia
e838bd1c79
Merge branch 'main' into litellm_batch_completions 2024-05-28 22:38:05 -07:00
Ishaan Jaff
6e9cc26ca1
Merge pull request #3888 from BerriAI/litellm_add_files_proxy
[Feat] LiteLLM Proxy Add `POST /v1/files` and `GET /v1/files`
2024-05-28 22:36:33 -07:00
Krrish Dholakia
1ebae6e7b0 fix(router.py): support comma-separated model list for batch completion fastest response 2024-05-28 21:34:37 -07:00
Krrish Dholakia
20106715d5 feat(proxy_server.py): enable batch completion fastest response calls on proxy
introduces new `fastest_response` flag for enabling the call
2024-05-28 20:09:31 -07:00
Ishaan Jaff
37e18a42fc feat - add file endpoints proxy 2024-05-28 20:03:13 -07:00
Krrish Dholakia
bffa79a8c5 feat(proxy_server.py): give request-level breakdown if ttft metric is selected for just that day
2024-05-28 18:09:22 -07:00
Ishaan Jaff
0020672c19 feat support GET /v1/batches/{batch_id} 2024-05-28 17:58:34 -07:00
Ishaan Jaff
c2e24b4ed8 feat add v1/batches 2024-05-28 17:49:36 -07:00
Ishaan Jaff
e6789a0150 fix app mounting ui 2024-05-28 08:14:15 -07:00
Ishaan Jaff
2069e36d1d
Merge pull request #3874 from BerriAI/litellm_make_email_alerts_free
[Feat] Add, Test Email Alerts on Admin UI
2024-05-28 08:09:14 -07:00
Krrish Dholakia
eee60b0686 fix(proxy_server.py): handle next js route serving on fastapi 2024-05-28 01:09:05 -07:00
Krrish Dholakia
293d5cf1f2 feat: e2e flow complete - admin can invite new users to proxy via invite links
Completes https://github.com/BerriAI/litellm/issues/3863
2024-05-27 23:19:30 -07:00
Ishaan Jaff
24c80e6bc7 fix - testing email alerting 2024-05-27 22:46:45 -07:00
Krish Dholakia
bcbc250a12
Merge branch 'main' into litellm_invite_users_via_link 2024-05-27 20:36:43 -07:00
Krrish Dholakia
86b66c13a4 feat(proxy_server.py): allow admin to invite users via invite link
Closes https://github.com/BerriAI/litellm/issues/3863
2024-05-27 20:32:25 -07:00
Ishaan Jaff
e1b46d4b6e
Merge pull request #3870 from BerriAI/litellm_rename_end_user
[Feat] Rename `/end/user/new` -> `/customer/new` (maintain backwards compatibility)
2024-05-27 19:42:14 -07:00
Ishaan Jaff
0feeb53868 fix - working customer/delete 2024-05-27 19:24:20 -07:00