Ishaan Jaff     | aa7d127901 | fix view users table | 2024-05-30 16:51:49 -07:00
Krrish Dholakia | 93166cdabf | fix(openai.py): fix openai response for /audio/speech endpoint | 2024-05-30 16:41:06 -07:00
Ishaan Jaff     | 67377e357d | use proxy_admin_viewer, and internal_user_viewer | 2024-05-30 16:37:19 -07:00
Ishaan Jaff     | bfb06d37b7 | fix ui_get_available_role | 2024-05-30 16:12:41 -07:00
Ishaan Jaff     | a3b1493bab | add doc string for enum | 2024-05-30 16:10:18 -07:00
Ishaan Jaff     | 338fc82ce8 | fix LitellmUserRoles | 2024-05-30 15:35:33 -07:00
Ishaan Jaff     | 3d4cda3373 | fix /user/new | 2024-05-30 15:33:23 -07:00
Ishaan Jaff     | b03f00062e | add better details about LitellmUserRoles | 2024-05-30 14:47:04 -07:00
Ishaan Jaff     | 0a3a37a196 | fix - /user/new | 2024-05-30 14:41:47 -07:00
Ishaan Jaff     | a4b6a959d8 | fix literal usage | 2024-05-30 14:28:53 -07:00
Ishaan Jaff     | 4861ff2fd4 | fix user_role test | 2024-05-30 14:10:08 -07:00
Ishaan Jaff     | 6166381579 | fix migrate to LitellmUserRoles | 2024-05-30 13:57:50 -07:00
Ishaan Jaff     | b29adeb6c6 | feat - move to using enums for roles | 2024-05-30 13:55:45 -07:00
Ishaan Jaff     | fc54acc17a | fix - use LitellmUserRoles | 2024-05-30 13:51:58 -07:00
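The commits above migrate hard-coded role strings to a shared `LitellmUserRoles` enum. A minimal sketch of what such a role enum might look like (hypothetical member set; not LiteLLM's actual definition):

```python
from enum import Enum


class UserRoles(str, Enum):
    """Roles a proxy user can hold (hypothetical subset).

    Subclassing str keeps the enum JSON-serializable and
    backwards-compatible with code that compares raw strings.
    """

    PROXY_ADMIN = "proxy_admin"
    PROXY_ADMIN_VIEWER = "proxy_admin_viewer"
    INTERNAL_USER = "internal_user"
    INTERNAL_USER_VIEWER = "internal_user_viewer"


# The str mixin means existing string comparisons keep working,
# which is what makes this kind of migration low-risk:
assert UserRoles.PROXY_ADMIN == "proxy_admin"
```

The str mixin is the usual trick for this migration: call sites that still pass or compare plain strings continue to work while new code gets exhaustive, typo-proof role names.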
Krish Dholakia  | 61454eaa0f | Merge pull request #3926 from BerriAI/litellm_404_error: fix(router.py): cooldown on 404 errors | 2024-05-30 12:03:09 -07:00
Krrish Dholakia | 32bfb685f5 | fix(router.py): cooldown on 404 errors (https://github.com/BerriAI/litellm/issues/3884) | 2024-05-30 10:57:38 -07:00
Ishaan Jaff     | 9f189ac91c | fix - verbose_proxy_logger debug | 2024-05-30 08:09:31 -07:00
Krrish Dholakia | 3167bee25a | fix(proxy_cli.py): enable json logging via litellm_settings param on config; allows user to enable json logs without needing to figure out env variables | 2024-05-29 21:41:20 -07:00
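Per the commit above, JSON logging becomes a `litellm_settings` key in the proxy config rather than an environment variable. A hedged sketch of what that stanza might look like (the exact key name is an assumption):

```yaml
# proxy config.yaml -- hypothetical sketch; key name assumed
litellm_settings:
  json_logs: true   # emit structured JSON log lines instead of plain text
```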
Ishaan Jaff     | d451bfbf50 | ui new build | 2024-05-29 21:11:09 -07:00
Ishaan Jaff     | c39db5686d | ui - new build | 2024-05-29 18:49:22 -07:00
Krish Dholakia  | 67973e5344 | Merge pull request #3910 from BerriAI/litellm_fix_end_user_max_budget: fix(proxy_server.py): fix end user object check when master key used | 2024-05-29 18:04:56 -07:00
Ishaan Jaff     | b7fcec8835 | fix batch tags | 2024-05-29 17:51:39 -07:00
Ishaan Jaff     | ea30769190 | Merge pull request #3897 from BerriAI/litellm_fix_latency_cal: ui - fix latency analytics on `completion_tokens` | 2024-05-29 17:32:17 -07:00
Ishaan Jaff     | f9b98c57a2 | Merge pull request #3894 from BerriAI/litellm_update_analytics_tab: [Feat] UI update analytics tab to show human friendly usage vals | 2024-05-29 17:31:21 -07:00
Krrish Dholakia | cfcf5969c8 | fix(proxy_server.py): fix end user object check when master key used; check if end user max budget exceeded for master key | 2024-05-29 17:20:59 -07:00
Krish Dholakia  | 8556f4afb0 | Merge pull request #3902 from BerriAI/litellm_openai_compatible_endpoint_fix: fix(openai.py): only allow 'user' as optional param if openai model | 2024-05-29 16:34:23 -07:00
Krrish Dholakia | f729370890 | feat(proxy_server.py): emit webhook event whenever customer spend is tracked (closes https://github.com/BerriAI/litellm/issues/3903) | 2024-05-29 15:59:32 -07:00
Krrish Dholakia | 3d32b00821 | fix(openai.py): only allow 'user' as optional param if openai model | 2024-05-29 15:15:02 -07:00
Krrish Dholakia | 8f0019c241 | docs(customers.md): add customer cost tracking to docs | 2024-05-29 14:55:33 -07:00
Ishaan Jaff     | 02598ae988 | fix(proxy_server.py): fix latency calc for avg output token | 2024-05-29 09:49:45 -07:00
Ishaan Jaff     | e252daaf2b | ui - fix latency analytics completion_tokens | 2024-05-29 09:39:53 -07:00
Ishaan Jaff     | 5958288ebb | fix /global/activity/model | 2024-05-29 08:58:25 -07:00
Krish Dholakia  | e838bd1c79 | Merge branch 'main' into litellm_batch_completions | 2024-05-28 22:38:05 -07:00
Ishaan Jaff     | 6e9cc26ca1 | Merge pull request #3888 from BerriAI/litellm_add_files_proxy: [Feat] LiteLLM Proxy Add `POST /v1/files` and `GET /v1/files` | 2024-05-28 22:36:33 -07:00
Krrish Dholakia | 1ebae6e7b0 | fix(router.py): support comma-separated model list for batch completion fastest response | 2024-05-28 21:34:37 -07:00
Krrish Dholakia | 20106715d5 | feat(proxy_server.py): enable batch completion fastest response calls on proxy; introduces new `fastest_response` flag for enabling the call | 2024-05-28 20:09:31 -07:00
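The `fastest_response` commits above race one prompt across a comma-separated list of models and return whichever answers first. The pattern can be sketched with `asyncio` (illustrative only; not LiteLLM's actual implementation, and the latency parameter stands in for real LLM calls):

```python
import asyncio


async def call_model(model: str, prompt: str, latency: float) -> str:
    """Stand-in for a real LLM call; latency simulates network/model time."""
    await asyncio.sleep(latency)
    return f"{model}: answer to {prompt!r}"


async def fastest_response(models: list[str], prompt: str,
                           latencies: dict[str, float]) -> str:
    """Fire the same prompt at every model; return the first completed answer."""
    tasks = [asyncio.create_task(call_model(m, prompt, latencies[m])) for m in models]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:  # cancel the slower in-flight calls
        task.cancel()
    return done.pop().result()


# Comma-separated model list, as in the router commit above:
models = "gpt-3.5-turbo,claude-3-haiku".split(",")
winner = asyncio.run(fastest_response(
    models, "hi", {"gpt-3.5-turbo": 0.05, "claude-3-haiku": 0.01}))
```

The key design choice is `FIRST_COMPLETED` plus cancellation: the caller pays only the latency of the fastest deployment, and the slower requests are abandoned rather than awaited.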
Ishaan Jaff     | 37e18a42fc | feat - add file endpoints proxy | 2024-05-28 20:03:13 -07:00
Krrish Dholakia | bffa79a8c5 | feat(proxy_server.py): give request-level breakdown if ttft metric is selected for just that day | 2024-05-28 18:09:22 -07:00
Ishaan Jaff     | 0020672c19 | feat - support GET /v1/batches{batch_id} | 2024-05-28 17:58:34 -07:00
Ishaan Jaff     | c2e24b4ed8 | feat add v1/batches | 2024-05-28 17:49:36 -07:00
Ishaan Jaff     | 37d350b466 | add batches, files to routes | 2024-05-28 17:39:17 -07:00
Ishaan Jaff     | 6a6f163d51 | ui - mount | 2024-05-28 08:55:01 -07:00
Ishaan Jaff     | e6789a0150 | fix app mounting ui | 2024-05-28 08:14:15 -07:00
Ishaan Jaff     | 83d381bdb2 | ui - new build, release | 2024-05-28 08:10:34 -07:00
Ishaan Jaff     | cea10e21e7 | bump: version 1.38.11 → 1.38.12 | 2024-05-28 08:10:34 -07:00
Ishaan Jaff     | 2069e36d1d | Merge pull request #3874 from BerriAI/litellm_make_email_alerts_free: [Feat] Add, Test Email Alerts on Admin UI | 2024-05-28 08:09:14 -07:00
Krrish Dholakia | eee60b0686 | fix(proxy_server.py): handle next js route serving on fastapi | 2024-05-28 01:09:05 -07:00
Krrish Dholakia | 293d5cf1f2 | feat: e2e flow complete - admin can invite new users to proxy via invite links (completes https://github.com/BerriAI/litellm/issues/3863) | 2024-05-27 23:19:30 -07:00
Ishaan Jaff     | 24c80e6bc7 | fix - testing email alerting | 2024-05-27 22:46:45 -07:00
Krish Dholakia  | bcbc250a12 | Merge branch 'main' into litellm_invite_users_via_link | 2024-05-27 20:36:43 -07:00