Commit graph

1251 commits

Author SHA1 Message Date
Krrish Dholakia
d3ffb22773 fix(proxy_server.py): allow 'upperbound_key_generate' params to be set via 'os.environ/' 2024-04-09 07:48:29 -07:00
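The `'os.environ/'` convention referenced above can be sketched as follows. `resolve_secret` is a hypothetical helper written for illustration, not litellm's actual implementation:

```python
import os

def resolve_secret(value):
    """Resolve a config value of the form 'os.environ/VAR_NAME' to the
    named environment variable's contents; pass other values through
    unchanged. (Hypothetical helper illustrating the convention.)"""
    prefix = "os.environ/"
    if isinstance(value, str) and value.startswith(prefix):
        return os.environ.get(value[len(prefix):])
    return value

# Example: an upperbound_key_generate param supplied via the environment
os.environ["UPPERBOUND_MAX_BUDGET"] = "100"
resolved = resolve_secret("os.environ/UPPERBOUND_MAX_BUDGET")
```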
Krrish Dholakia
57b4536159 fix(proxy_server.py): raise exception if max user value > max value set in config 2024-04-09 07:34:34 -07:00
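The upperbound check described in the commit above might look roughly like this; the function name and parameters are illustrative, not litellm's API:

```python
def validate_key_params(requested_max_budget, config_upperbound):
    """Reject a key-generation request whose max_budget exceeds the
    upperbound set in the proxy config. (Illustrative sketch.)"""
    if (
        config_upperbound is not None
        and requested_max_budget is not None
        and requested_max_budget > config_upperbound
    ):
        raise ValueError(
            f"max_budget={requested_max_budget} exceeds the configured "
            f"upperbound of {config_upperbound}"
        )
    return requested_max_budget
```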
Ishaan Jaff
31e0563036 Merge pull request #2912 from BerriAI/litellm_ui_qa
[UI] QA Improvements to adding callbacks, models
2024-04-09 07:34:05 -07:00
Ishaan Jaff
37f565ccc4 instantly add deployment on /model/new 2024-04-09 07:32:24 -07:00
Krrish Dholakia
4521112aeb fix(proxy_server.py): allow jwt public key ttl to be passed as os.environ 2024-04-09 07:23:13 -07:00
Ishaan Jaff
203c22c50f fix - ui qa 2024-04-09 07:19:27 -07:00
Krish Dholakia
d81582a2c6 Merge pull request #2907 from BerriAI/litellm_map_user_to_org
fix(proxy_server.py): allow mapping a user to an org
2024-04-08 20:50:47 -07:00
Krrish Dholakia
da216c6915 fix(proxy_server.py): allow mapping a user to an org 2024-04-08 20:45:11 -07:00
Ishaan Jaff
dd7ee63631 ui - new build 2024-04-08 20:43:28 -07:00
Ishaan Jaff
a3a21a1e8d feat - add slack webhooks 2024-04-08 20:11:48 -07:00
Ishaan Jaff
a48e79a6f7 dynamically set litellm callbacks 2024-04-08 18:13:13 -07:00
Krrish Dholakia
e264c88462 build(ui): fix navbar 2024-04-08 17:51:17 -07:00
Krrish Dholakia
54959cd234 fix(proxy_server.py): add more detailed debugging 2024-04-08 17:37:55 -07:00
Ishaan Jaff
02407c684c Merge pull request #2903 from BerriAI/litellm_set_new_callbacks_ui
ui - add langfuse modal
2024-04-08 17:37:38 -07:00
Ishaan Jaff
0c5e3d18fb ui - set langfuse callbacks 2024-04-08 17:32:20 -07:00
Krrish Dholakia
4ef4fb80f8 fix(proxy_server.py): allow for no models in model_list - all models passed in via /model/new 2024-04-08 16:17:52 -07:00
Ishaan Jaff
491cabbe94 Merge pull request #2902 from BerriAI/litellm_ui_set_get_callbacks
UI view set callbacks
2024-04-08 16:08:14 -07:00
Ishaan Jaff
6b4c6b3d58 proxy - add callbacks 2024-04-08 15:59:04 -07:00
Ishaan Jaff
4f64f338e2 ui - return callbacks set on litellm 2024-04-08 15:25:11 -07:00
Krrish Dholakia
20991b31a2 fix(proxy_server.py): add info statement 2024-04-08 15:12:56 -07:00
Krrish Dholakia
187d53b305 docs(demo.md): fix iframe link 2024-04-08 15:11:19 -07:00
Ishaan Jaff
b1a07cb023 feat - write success callbacks to db 2024-04-08 15:03:18 -07:00
Krrish Dholakia
c15086073f fix(proxy_server.py): allow /model/new feature flag to work via env 2024-04-08 14:57:19 -07:00
Krrish Dholakia
0dad78b53c feat(proxy/utils.py): return api base for request hanging alerts 2024-04-06 15:58:53 -07:00
Krrish Dholakia
b09960b750 feat(prisma.schema): allow setting user_alias 2024-04-05 17:04:14 -07:00
Krrish Dholakia
3401c9c120 fix(proxy_server.py): remove original response from headers - causing nginx to fail
response headers > 4kb cause nginx to fail
2024-04-05 12:36:14 -07:00
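The commit above removed the original-response header because responses with headers over roughly 4 KB exceed nginx's default buffer and fail at the proxy. A defensive sketch of the idea (the actual fix simply stopped emitting the oversized header; the constant and helper below are assumptions):

```python
MAX_HEADER_BYTES = 4096  # typical nginx default buffer for one header line

def safe_response_headers(headers):
    """Drop any single header whose serialized size would exceed what a
    default-configured nginx can buffer. (Illustrative sketch, not
    litellm's implementation.)"""
    return {
        name: value
        for name, value in headers.items()
        # name + ": " + value + "\r\n" must fit in one buffer
        if len(name) + len(str(value)) + 4 <= MAX_HEADER_BYTES
    }
```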
Krish Dholakia
60f7168d0b Merge branch 'main' into litellm_add_model_api_fix 2024-04-04 19:43:52 -07:00
Krrish Dholakia
16ef3afe7d fix(proxy_server.py): skip to next model if invalid model (raise logger.error) 2024-04-04 19:22:17 -07:00
Krrish Dholakia
5bbc2f4985 fix(proxy_server.py): fix linting issue 2024-04-04 19:15:57 -07:00
Krrish Dholakia
ece37a4b7f feat(ui): add models via ui
adds ability to add models via ui to the proxy. also fixes additional bugs around new /model/new endpoint
2024-04-04 18:56:20 -07:00
Ishaan Jaff
1c51e17205 ui show spend per tag 2024-04-04 16:57:45 -07:00
Ishaan Jaff
a613d5183a Merge pull request #2840 from BerriAI/litellm_return_cache_key_responses
[FEAT] Proxy - Delete Cache Keys + return cache key in responses
2024-04-04 11:52:52 -07:00
Ishaan Jaff
f8fe8c787a return cache key in streaming responses 2024-04-04 11:00:00 -07:00

Ishaan Jaff
a515441ad1 feat - delete cache key 2024-04-04 10:56:47 -07:00
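The two commits above pair returning a cache key in responses with deleting by that key. One way this works in principle: hash the request deterministically so the key a client reads from a response header can later be passed to a delete endpoint. This is a minimal sketch under that assumption, not litellm's cache implementation:

```python
import hashlib
import json

def make_cache_key(request_body):
    """Hash the request body deterministically (sorted keys) so identical
    requests always map to the same cache key. (Illustrative sketch.)"""
    canonical = json.dumps(request_body, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def delete_cache_key(cache, key):
    """Delete a cached entry by the key the client read from the response
    headers; returns True if an entry was removed."""
    return cache.pop(key, None) is not None
```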
Krrish Dholakia
0716526c53 fix(proxy_server.py): fix linting error 2024-04-04 10:40:32 -07:00
Ishaan Jaff
d0977d61ff v0 return cache key in responses 2024-04-04 10:11:18 -07:00
Krrish Dholakia
1216e1c3c3 test(test_models.py): fix delete model test 2024-04-04 08:46:08 -07:00
Krrish Dholakia
ec29f44c1f fix: raise correct error 2024-04-03 22:37:51 -07:00
Krish Dholakia
b01e1c24df Merge branch 'main' into litellm_model_add_api 2024-04-03 20:29:44 -07:00
Krrish Dholakia
129bb52e9d fix(proxy_server.py): persist models added via /model/new to db
allows models to be used across instances

https://github.com/BerriAI/litellm/issues/2319 , https://github.com/BerriAI/litellm/issues/2329
2024-04-03 20:16:41 -07:00
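Persisting `/model/new` entries to a shared database, as the commit above describes, is what lets every proxy instance serve a model added on any one of them. A minimal sketch of the pattern using stdlib `sqlite3` (litellm itself uses Prisma; table and function names here are hypothetical):

```python
import json
import sqlite3

def add_model(conn, model_name, litellm_params):
    """Persist a model added via /model/new so other proxy instances can
    load it on startup. (Sketch of the pattern, not litellm's schema.)"""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS proxy_models "
        "(model_name TEXT PRIMARY KEY, litellm_params TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO proxy_models VALUES (?, ?)",
        (model_name, json.dumps(litellm_params)),
    )
    conn.commit()

def load_models(conn):
    """Read the shared model list back, e.g. on another instance's boot."""
    rows = conn.execute("SELECT model_name, litellm_params FROM proxy_models")
    return {name: json.loads(params) for name, params in rows}
```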
Ishaan Jaff
13c1130d90 fix team update bug 2024-04-03 19:38:07 -07:00
Krrish Dholakia
f3fcfe3007 fix(proxy_server.py): return original model response via response headers - /v1/completions
to help devs with debugging
2024-04-03 13:05:43 -07:00
Krrish Dholakia
71b8a60d14 fix(proxy_server.py): support calling public endpoints when jwt_auth is enabled 2024-04-03 07:56:53 -07:00
Ishaan Jaff
a46179947e fix safe use token id 2024-04-02 21:40:35 -07:00
Ishaan Jaff
734c4fba3b v0 use token_id in /key_generate 2024-04-02 21:31:24 -07:00
Ishaan Jaff
ec1ba2e216 proxy test all-team-models 2024-04-02 20:50:47 -07:00
Krrish Dholakia
029ee15951 perf(proxy_server.py): batch write spend logs
reduces prisma client errors, by batch writing spend logs - max 1k logs at a time
2024-04-02 18:46:55 -07:00
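The batching described above (at most 1k spend logs per write, to cut down on Prisma client errors) reduces to splitting the pending queue into bounded chunks before each DB call. A minimal sketch of that chunking, with a hypothetical function name:

```python
def chunk_logs(pending_logs, max_batch=1000):
    """Split pending spend logs into batches of at most max_batch entries,
    so each database write stays bounded. (Sketch of the batching the
    commit above describes.)"""
    return [
        pending_logs[i:i + max_batch]
        for i in range(0, len(pending_logs), max_batch)
    ]
```

Each batch would then be flushed in its own write, so one oversized payload can no longer take down the whole flush.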
Ishaan Jaff
6534f887cc support all-proxy-models for teams 2024-04-02 16:04:09 -07:00
Ishaan Jaff
857a7c9135 support all-models-on-proxy 2024-04-02 15:52:54 -07:00
Ishaan Jaff
78edb29a29 (fix) support all-models alias on backend 2024-04-02 15:12:37 -07:00