proxy ui

Create proxy keys and track spend per key.

👉 The UI is available at /ui on your proxy. See the docs for setup details.

[screenshot: proxy UI]