Commit graph

1767 commits

Author SHA1 Message Date
Ishaan Jaff
a7676a28d8 feat - send_invite_email on /user/new 2024-05-31 08:55:49 -07:00
Ishaan Jaff
2d81ecdf8b Merge branch 'main' into litellm_invite_link_flow_2 2024-05-31 08:14:52 -07:00
Krrish Dholakia
2fdf4a7bb4 feat(proxy_server.py): add assistants api endpoints to proxy server 2024-05-30 22:44:43 -07:00
Ishaan Jaff
4ddfc1b98b fix - rate limit error dashboard 2024-05-30 20:56:05 -07:00
Ishaan Jaff
e981f342bc feat - show global/activity/exception 2024-05-30 17:55:38 -07:00
Krish Dholakia
73e3dba2f6 Merge pull request #3928 from BerriAI/litellm_audio_speech_endpoint
feat(main.py): support openai tts endpoint
2024-05-30 17:30:42 -07:00
Krrish Dholakia
6b4153ff03 fix(main.py): add logging to audio_transcription calls 2024-05-30 16:57:11 -07:00
Krrish Dholakia
eb159b64e1 fix(openai.py): fix openai response for /audio/speech endpoint 2024-05-30 16:41:06 -07:00
Ishaan Jaff
b58c8cd253 fix ui_get_available_role 2024-05-30 16:12:41 -07:00
Ishaan Jaff
eb8cc5a21e fix /user/new 2024-05-30 15:33:23 -07:00
Ishaan Jaff
b7dabe9bed fix literal usage 2024-05-30 14:28:53 -07:00
Ishaan Jaff
89bb364775 fix user_role test 2024-05-30 14:10:08 -07:00
Ishaan Jaff
ff33241ce2 feat - move to using enums for roles 2024-05-30 13:55:45 -07:00
Ishaan Jaff
0dd84b66c2 fix - verbose_proxy_logger debug 2024-05-30 08:09:31 -07:00
Krish Dholakia
7ef23e6fe9 Merge pull request #3910 from BerriAI/litellm_fix_end_user_max_budget
fix(proxy_server.py): fix end user object check when master key used
2024-05-29 18:04:56 -07:00
Ishaan Jaff
60d62c06a5 fix batch tags 2024-05-29 17:51:39 -07:00
Ishaan Jaff
2c35d2ee95 Merge pull request #3897 from BerriAI/litellm_fix_latency_cal
ui - fix latency analytics on `completion_tokens`
2024-05-29 17:32:17 -07:00
Ishaan Jaff
88e13b8f8b Merge pull request #3894 from BerriAI/litellm_update_analytics_tab
[Feat] UI update analytics tab to show human friendly usage vals
2024-05-29 17:31:21 -07:00
Krrish Dholakia
bb3c786194 fix(proxy_server.py): fix end user object check when master key used
check if end user max budget exceeded for master key
2024-05-29 17:20:59 -07:00
Krrish Dholakia
df0365175a feat(proxy_server.py): emit webhook event whenever customer spend is tracked
Closes https://github.com/BerriAI/litellm/issues/3903
2024-05-29 15:59:32 -07:00
Krrish Dholakia
a4dae8e9f1 docs(customers.md): add customer cost tracking to docs 2024-05-29 14:55:33 -07:00
Ishaan Jaff
0521fec230 fix(proxy_server.py): fix latency calc for avg output token 2024-05-29 09:49:45 -07:00
Ishaan Jaff
467e6dd7dd ui - fix latency analytics completion_tokens 2024-05-29 09:39:53 -07:00
Ishaan Jaff
41604b53a8 fix /global/activity/model 2024-05-29 08:58:25 -07:00
Krish Dholakia
4fd3994b4e Merge branch 'main' into litellm_batch_completions 2024-05-28 22:38:05 -07:00
Ishaan Jaff
d8245cbccb Merge pull request #3888 from BerriAI/litellm_add_files_proxy
[Feat] LiteLLM Proxy Add `POST /v1/files` and `GET /v1/files`
2024-05-28 22:36:33 -07:00
Krrish Dholakia
012bde0b07 fix(router.py): support comma-separated model list for batch completion fastest response 2024-05-28 21:34:37 -07:00
Krrish Dholakia
792b25c772 feat(proxy_server.py): enable batch completion fastest response calls on proxy
introduces new `fastest_response` flag for enabling the call
2024-05-28 20:09:31 -07:00
Ishaan Jaff
18830e58e9 feat - add file endpoints proxy 2024-05-28 20:03:13 -07:00
Krrish Dholakia
4dd7b05406 feat(proxy_server.py): give request-level breakdown if ttft metric is selected for just that day 2024-05-28 18:09:22 -07:00
Ishaan Jaff
ea609a0b14 feat support GET /v1/batches{batch_id} 2024-05-28 17:58:34 -07:00
Ishaan Jaff
e14e5e8f8e feat add v1/batches 2024-05-28 17:49:36 -07:00
Ishaan Jaff
b38196fbd9 fix app mounting ui 2024-05-28 08:14:15 -07:00
Ishaan Jaff
f878c54986 Merge pull request #3874 from BerriAI/litellm_make_email_alerts_free
[Feat] Add, Test Email Alerts on Admin UI
2024-05-28 08:09:14 -07:00
Krrish Dholakia
a4634ef6d2 fix(proxy_server.py): handle next js route serving on fastapi 2024-05-28 01:09:05 -07:00
Krrish Dholakia
991fc9563a feat: e2e flow complete - admin can invite new users to proxy via invite links
Completes https://github.com/BerriAI/litellm/issues/3863
2024-05-27 23:19:30 -07:00
Ishaan Jaff
c8d3af1117 fix - testing email alerting 2024-05-27 22:46:45 -07:00
Krish Dholakia
e2c15e84a0 Merge branch 'main' into litellm_invite_users_via_link 2024-05-27 20:36:43 -07:00
Krrish Dholakia
b5e17de0bc feat(proxy_server.py): allow admin to invite users via invite link
Closes https://github.com/BerriAI/litellm/issues/3863
2024-05-27 20:32:25 -07:00
Ishaan Jaff
46d506169a Merge pull request #3870 from BerriAI/litellm_rename_end_user
[Feat] Rename `/end/user/new` -> `/customer/new` (maintain backwards compatibility)
2024-05-27 19:42:14 -07:00
Ishaan Jaff
a27b1dd13d fix - working customer/delete 2024-05-27 19:24:20 -07:00
Krish Dholakia
7f4a6ad956 Merge pull request #3869 from BerriAI/litellm_public_model_hubg
Admin UI - Public model hub
2024-05-27 19:14:11 -07:00
Ishaan Jaff
39936fa726 fix - /customer/update 2024-05-27 19:02:20 -07:00
Ishaan Jaff
d9b7d078de feat - add validation for existing customers 2024-05-27 18:29:09 -07:00
Ishaan Jaff
a997e8f811 docs string -> end user /new 2024-05-27 18:20:44 -07:00
Ishaan Jaff
a90b18a741 feat - rename end_user -> customer 2024-05-27 18:17:46 -07:00
Krrish Dholakia
abdc5b357f feat(model_hub.tsx): enable admin to expose a public model hub 2024-05-27 17:45:45 -07:00
Ishaan Jaff
5fed67dcc3 Merge pull request #3868 from BerriAI/litellm_show_updated_created_models
[Feat] Show Created at, Created by on `Models` Page
2024-05-27 16:32:29 -07:00
Ishaan Jaff
4d14366ae3 backend - add audit logs for adding models 2024-05-27 16:29:51 -07:00
Ishaan Jaff
d7043baf6d router - include updated at and created at in model info 2024-05-27 15:53:16 -07:00