litellm-mirror/litellm/proxy/management_endpoints
Latest commit: fix(main.py): fix retries being multiplied when using openai sdk (#7221), Krish Dholakia (ec36353b41)
* fix(main.py): fix retries being multiplied when using openai sdk

Closes https://github.com/BerriAI/litellm/pull/7130

* docs(prompt_management.md): add langfuse prompt management doc

* feat(team_endpoints.py): allow teams to add their own models

Enables teams to call their own finetuned models via the proxy

* test: add better enforcement-check tests for `/model/new` now that teams can add their own models

* docs(team_model_add.md): tutorial for allowing teams to add their own models

* test: fix test
2024-12-14 11:56:55 -08:00
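The headline fix above addresses a classic layered-retry pitfall: when both a wrapper and the underlying OpenAI SDK retry failed requests independently, the effective number of transport attempts is multiplicative, not additive. A minimal, library-agnostic sketch of the problem and the usual fix (all function names and retry counts here are illustrative, not litellm's actual code):

```python
# Illustrative only: shows why stacked retry layers multiply transport attempts.
attempts = 0

def flaky_request():
    """Simulated transport call that always fails."""
    global attempts
    attempts += 1
    raise ConnectionError("simulated network failure")

def sdk_call(max_retries):
    """Inner layer: the SDK performs 1 initial try plus max_retries retries."""
    for _ in range(1 + max_retries):
        try:
            return flaky_request()
        except ConnectionError:
            pass
    raise ConnectionError("sdk exhausted retries")

def wrapper_call(num_retries, sdk_max_retries):
    """Outer layer: the wrapper retries the whole SDK call again."""
    for _ in range(1 + num_retries):
        try:
            return sdk_call(sdk_max_retries)
        except ConnectionError:
            pass
    raise ConnectionError("wrapper exhausted retries")

# Both layers retrying: (1 + 2) outer tries x (1 + 2) inner tries.
try:
    wrapper_call(num_retries=2, sdk_max_retries=2)
except ConnectionError:
    pass
multiplied_attempts = attempts  # 9 transport attempts for "2 retries"

# Fix pattern: let exactly one layer own retries (inner layer set to 0),
# so the configured retry count is the real attempt count.
attempts = 0
try:
    wrapper_call(num_retries=2, sdk_max_retries=0)
except ConnectionError:
    pass
single_layer_attempts = attempts  # 3 transport attempts

print(multiplied_attempts, single_layer_attempts)  # 9 3
```

The design point: retry policy should live in one layer; the other layers pass requests through with retries disabled.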
File | Last commit | Date
customer_endpoints.py | LiteLLM Minor Fixes & Improvements (12/05/2024) (#7051) | 2024-12-06 14:29:53 -08:00
internal_user_endpoints.py | fix viewing keys (#7042) | 2024-12-05 08:01:09 -08:00
key_management_endpoints.py | fix(key_management_endpoints.py): override metadata field value on up… (#7008) | 2024-12-03 23:03:50 -08:00
organization_endpoints.py | LiteLLM Minor Fixes & Improvements (11/23/2024) (#6870) | 2024-11-23 15:17:40 +05:30
sso_helper_utils.py | LiteLLM Minor Fixes & Improvements (10/17/2024) (#6293) | 2024-10-17 22:09:11 -07:00
team_callback_endpoints.py | (QOL improvement) Provider budget routing - allow using 1s, 1d, 1mo, 2mo etc (#6885) | 2024-11-23 16:59:46 -08:00
team_endpoints.py | fix(main.py): fix retries being multiplied when using openai sdk (#7221) | 2024-12-14 11:56:55 -08:00
ui_sso.py | (feat) UI - Disable Usage Tab once SpendLogs is 1M+ Rows (#7208) | 2024-12-12 18:43:17 -08:00
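The provider-budget commit listed for team_callback_endpoints.py accepts duration strings like "1s", "1d", "1mo" as budget windows. A hedged sketch of how such strings can be parsed into seconds (the function name, regex, and unit table are assumptions for illustration, not litellm's implementation; "mo" is approximated as a 30-day month):

```python
import re

# Seconds per unit; "mo" uses a 30-day month as a simplifying assumption.
_UNIT_SECONDS = {
    "s": 1,
    "m": 60,
    "h": 3600,
    "d": 86400,
    "mo": 2592000,
}

def duration_to_seconds(value: str) -> int:
    """Parse strings like '1s', '30m', '1d', '2mo' into a second count."""
    match = re.fullmatch(r"(\d+)(mo|[smhd])", value.strip())
    if match is None:
        raise ValueError(f"unsupported duration: {value!r}")
    count, unit = match.groups()
    return int(count) * _UNIT_SECONDS[unit]

print(duration_to_seconds("1s"))   # 1
print(duration_to_seconds("1d"))   # 86400
print(duration_to_seconds("2mo"))  # 5184000
```

Note the regex tries "mo" before the single-letter units so "2mo" is not misread as "2m" with trailing garbage.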