Commit graph

2445 commits

Author SHA1 Message Date
Krrish Dholakia
32bfb685f5 fix(router.py): cooldown on 404 errors
https://github.com/BerriAI/litellm/issues/3884
2024-05-30 10:57:38 -07:00
Ishaan Jaff
9f189ac91c fix - verbose_proxy_logger debug 2024-05-30 08:09:31 -07:00
Krrish Dholakia
3167bee25a fix(proxy_cli.py): enable json logging via litellm_settings param on config
allows user to enable json logs without needing to figure out env variables
2024-05-29 21:41:20 -07:00
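The json-logging commit above points at a `litellm_settings` block in the proxy config. A minimal sketch of what that config might look like, assuming the key is named `json_logs` (the exact key name is an assumption based on the commit message, not confirmed by this log):

```yaml
# Hypothetical proxy config enabling JSON logs via litellm_settings
# (per commit 3167bee25a), instead of setting env variables.
litellm_settings:
  json_logs: true
```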
Ishaan Jaff
d451bfbf50 ui new build 2024-05-29 21:11:09 -07:00
Ishaan Jaff
c39db5686d ui - new build 2024-05-29 18:49:22 -07:00
Krish Dholakia
67973e5344
Merge pull request #3910 from BerriAI/litellm_fix_end_user_max_budget
fix(proxy_server.py): fix end user object check when master key used
2024-05-29 18:04:56 -07:00
Ishaan Jaff
b7fcec8835 fix batch tags 2024-05-29 17:51:39 -07:00
Ishaan Jaff
ea30769190
Merge pull request #3897 from BerriAI/litellm_fix_latency_cal
ui - fix latency analytics on `completion_tokens`
2024-05-29 17:32:17 -07:00
Ishaan Jaff
f9b98c57a2
Merge pull request #3894 from BerriAI/litellm_update_analytics_tab
[Feat] UI update analytics tab to show human friendly usage vals
2024-05-29 17:31:21 -07:00
Krrish Dholakia
cfcf5969c8 fix(proxy_server.py): fix end user object check when master key used
check if end user max budget exceeded for master key
2024-05-29 17:20:59 -07:00
Krish Dholakia
8556f4afb0
Merge pull request #3902 from BerriAI/litellm_openai_compatible_endpoint_fix
fix(openai.py): only allow 'user' as optional param if openai model
2024-05-29 16:34:23 -07:00
Krrish Dholakia
f729370890 feat(proxy_server.py): emit webhook event whenever customer spend is tracked
Closes https://github.com/BerriAI/litellm/issues/3903
2024-05-29 15:59:32 -07:00
Krrish Dholakia
3d32b00821 fix(openai.py): only allow 'user' as optional param if openai model 2024-05-29 15:15:02 -07:00
Krrish Dholakia
8f0019c241 docs(customers.md): add customer cost tracking to docs 2024-05-29 14:55:33 -07:00
Ishaan Jaff
02598ae988 fix(proxy_server.py): fix latency calc for avg output token 2024-05-29 09:49:45 -07:00
Ishaan Jaff
e252daaf2b ui - fix latency analytics completion_tokens 2024-05-29 09:39:53 -07:00
Ishaan Jaff
5958288ebb fix /global/activity/model 2024-05-29 08:58:25 -07:00
Krish Dholakia
e838bd1c79
Merge branch 'main' into litellm_batch_completions 2024-05-28 22:38:05 -07:00
Ishaan Jaff
6e9cc26ca1
Merge pull request #3888 from BerriAI/litellm_add_files_proxy
[Feat] LiteLLM Proxy Add `POST /v1/files` and `GET /v1/files`
2024-05-28 22:36:33 -07:00
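PR #3888 adds `POST /v1/files` and `GET /v1/files` to the proxy, mirroring OpenAI's files API. A hedged client-side sketch of the two calls, building prepared requests without sending them (the base URL, file name, and `purpose` value are placeholders, not taken from this log):

```python
import io
import requests

# Placeholder proxy address; a real deployment would also send an API key header.
BASE = "http://localhost:4000"

def prepare_upload(fileobj, purpose="batch"):
    # POST /v1/files — multipart upload, mirroring OpenAI's files endpoint
    return requests.Request(
        "POST",
        f"{BASE}/v1/files",
        files={"file": ("data.jsonl", fileobj)},
        data={"purpose": purpose},
    ).prepare()

def prepare_list():
    # GET /v1/files — list previously uploaded files
    return requests.Request("GET", f"{BASE}/v1/files").prepare()

upload = prepare_upload(io.BytesIO(b'{"example": 1}'))
listing = prepare_list()
print(listing.method, listing.url)  # -> GET http://localhost:4000/v1/files
```

Using `requests.Request(...).prepare()` keeps the sketch inspectable offline; swapping `prepare()` for `requests.Session().send(...)` would perform the actual calls.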
Krrish Dholakia
1ebae6e7b0 fix(router.py): support comma-separated model list for batch completion fastest response 2024-05-28 21:34:37 -07:00
Krrish Dholakia
20106715d5 feat(proxy_server.py): enable batch completion fastest response calls on proxy
introduces new `fastest_response` flag for enabling the call
2024-05-28 20:09:31 -07:00
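The two commits above describe a comma-separated model list plus a `fastest_response` flag for batch completion on the proxy. A hypothetical sketch of the request payload a client might build; the field names are assumptions reconstructed from the commit messages (1ebae6e7b0 and 20106715d5), not a confirmed API:

```python
def build_fastest_response_request(models, messages):
    """Build a chat-completion payload that races several models and
    keeps the fastest answer (field names are assumptions)."""
    return {
        # comma-separated model list, per commit 1ebae6e7b0
        "model": ", ".join(models),
        # flag enabling the fastest-response call, per commit 20106715d5
        "fastest_response": True,
        "messages": messages,
    }

payload = build_fastest_response_request(
    ["gpt-3.5-turbo", "claude-3-haiku"],
    [{"role": "user", "content": "hi"}],
)
print(payload["model"])  # -> gpt-3.5-turbo, claude-3-haiku
```

A client would POST this payload to the proxy's chat-completions route; the proxy would fan the request out and return whichever model responds first.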
Ishaan Jaff
37e18a42fc feat - add file endpoints proxy 2024-05-28 20:03:13 -07:00
Krrish Dholakia
bffa79a8c5 feat(proxy_server.py): give request-level breakdown if ttft metric is selected for just that day
2024-05-28 18:09:22 -07:00
Ishaan Jaff
0020672c19 feat - support GET /v1/batches/{batch_id} 2024-05-28 17:58:34 -07:00
Ishaan Jaff
c2e24b4ed8 feat add v1/batches 2024-05-28 17:49:36 -07:00
Ishaan Jaff
37d350b466 add batches, files to routes 2024-05-28 17:39:17 -07:00
Ishaan Jaff
6a6f163d51 ui - mount 2024-05-28 08:55:01 -07:00
Ishaan Jaff
e6789a0150 fix app mounting ui 2024-05-28 08:14:15 -07:00
Ishaan Jaff
83d381bdb2 ui - new build, release 2024-05-28 08:10:34 -07:00
Ishaan Jaff
cea10e21e7 bump: version 1.38.11 → 1.38.12 2024-05-28 08:10:34 -07:00
Ishaan Jaff
2069e36d1d
Merge pull request #3874 from BerriAI/litellm_make_email_alerts_free
[Feat] Add, Test Email Alerts on Admin UI
2024-05-28 08:09:14 -07:00
Krrish Dholakia
eee60b0686 fix(proxy_server.py): handle next js route serving on fastapi 2024-05-28 01:09:05 -07:00
Krrish Dholakia
293d5cf1f2 feat: e2e flow complete - admin can invite new users to proxy via invite links
Completes https://github.com/BerriAI/litellm/issues/3863
2024-05-27 23:19:30 -07:00
Ishaan Jaff
24c80e6bc7 fix - testing email alerting 2024-05-27 22:46:45 -07:00
Krish Dholakia
bcbc250a12
Merge branch 'main' into litellm_invite_users_via_link 2024-05-27 20:36:43 -07:00
Krrish Dholakia
86b66c13a4 feat(proxy_server.py): allow admin to invite users via invite link
Closes https://github.com/BerriAI/litellm/issues/3863
2024-05-27 20:32:25 -07:00
Ishaan Jaff
4b0a8ff1f8
Merge pull request #3872 from BerriAI/litellm_make_email_alerts_free
[Feat] Make Email alerting Free Tier, but customizing emails enterprise
2024-05-27 19:46:03 -07:00
Ishaan Jaff
e1b46d4b6e
Merge pull request #3870 from BerriAI/litellm_rename_end_user
[Feat] Rename `/end/user/new` -> `/customer/new` (maintain backwards compatibility)
2024-05-27 19:42:14 -07:00
Ishaan Jaff
7d5fe910f2 fix - make email alerting free 2024-05-27 19:27:20 -07:00
Ishaan Jaff
0feeb53868 fix - working customer/delete 2024-05-27 19:24:20 -07:00
Krish Dholakia
9c8af6676c
Merge pull request #3869 from BerriAI/litellm_public_model_hubg
Admin UI - Public model hub
2024-05-27 19:14:11 -07:00
Krrish Dholakia
6b50e656b8 fix(main.py): pass extra headers through for async calls 2024-05-27 19:11:40 -07:00
Ishaan Jaff
f588616498 fix - /customer/update 2024-05-27 19:02:20 -07:00
Ishaan Jaff
24f0b82755 feat - add validation for existing customers 2024-05-27 18:29:09 -07:00
Ishaan Jaff
cdf32ebf0e docs string -> end user /new 2024-05-27 18:20:44 -07:00
Ishaan Jaff
9b8c143601 feat - rename end_user -> customer 2024-05-27 18:17:46 -07:00
Krrish Dholakia
4516daec90 feat(model_hub.tsx): enable admin to expose a public model hub 2024-05-27 17:45:45 -07:00
Ishaan Jaff
d71bb96047
Merge pull request #3868 from BerriAI/litellm_show_updated_created_models
[Feat] Show Created at, Created by on `Models` Page
2024-05-27 16:32:29 -07:00
Ishaan Jaff
ef7f3bc4fe backend - add audit logs for adding models 2024-05-27 16:29:51 -07:00
Ishaan Jaff
eccda76edc router - include updated at and created at in model info 2024-05-27 15:53:16 -07:00