litellm-mirror/litellm/proxy/management_endpoints
Krish Dholakia 6729c9ca7f
LiteLLM Minor Fixes & Improvements (10/07/2024) (#6101)
* fix(utils.py): support dropping temperature param for azure o1 models

* fix(main.py): handle azure o1 streaming requests

o1 doesn't support streaming, so fake it to ensure calling code works as expected

* feat(utils.py): expose `hosted_vllm/` endpoint, with tool handling for vllm

Fixes https://github.com/BerriAI/litellm/issues/6088

* refactor(internal_user_endpoints.py): cleanup unused params + update docstring

Closes https://github.com/BerriAI/litellm/issues/6100

* fix(main.py): expose custom image generation api support

Fixes https://github.com/BerriAI/litellm/issues/6097

* fix: fix linting errors

* docs(custom_llm_server.md): add docs on custom api for image gen calls

* fix(types/utils.py): handle dict type

* fix(types/utils.py): fix linting errors
2024-10-07 22:17:22 -07:00
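The first two fixes in the commit describe a pair of small compatibility techniques: dropping request params a model rejects (azure o1 and `temperature`), and faking a stream for models that can't stream. A minimal sketch of the idea, with hypothetical helper names — not LiteLLM's actual implementation:

```python
# Hypothetical helpers illustrating the two fixes above; LiteLLM's real
# code lives in utils.py / main.py and differs in detail.

def drop_unsupported_params(params, unsupported=("temperature",)):
    """Return a copy of the request params without keys the model rejects."""
    return {k: v for k, v in params.items() if k not in unsupported}

def fake_stream(full_text, chunk_size=16):
    """Complete the call normally elsewhere, then replay the finished text
    as a chunk generator so callers that iterate over a stream still work."""
    for i in range(0, len(full_text), chunk_size):
        yield full_text[i:i + chunk_size]
```

A caller that expects a stream can consume `fake_stream(...)` unchanged, and `drop_unsupported_params` is applied before the request is sent, so the provider never sees the rejected key.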
internal_user_endpoints.py LiteLLM Minor Fixes & Improvements (10/07/2024) (#6101) 2024-10-07 22:17:22 -07:00
key_management_endpoints.py correct use of healthy / unhealthy 2024-10-06 13:48:30 +05:30
team_callback_endpoints.py add endpoint to disable logging for a team 2024-07-23 08:38:23 -07:00
team_endpoints.py Litellm ruff linting enforcement (#5992) 2024-10-01 19:44:20 -04:00
ui_sso.py (proxy ui sso flow) - fix invite user sso flow (#6093) 2024-10-07 12:32:08 +05:30