litellm/ui

Commit 3933fba41f by Krish Dholakia: LiteLLM Minor Fixes & Improvements (09/19/2024) (#5793)
* fix(model_prices_and_context_window.json): add cost tracking for more vertex llama3.1 models (8b and 70b)

* fix(proxy/utils.py): handle data being none on pre-call hooks

* fix(proxy/): create views on initial proxy startup

Fixes the base case where a user starts the proxy for the first time.

Fixes https://github.com/BerriAI/litellm/issues/5756

* build(config.yml): fix vertex version for test

* feat(ui/): support enabling/disabling slack alerting

Allows an admin to turn Slack alerting on/off through the UI

* feat(rerank/main.py): support langfuse logging

* fix(proxy/utils.py): fix linting errors

* fix(langfuse.py): log clean metadata

* test(tests): replace deprecated openai model
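The Slack alerting toggle above maps to the proxy's alerting settings in its YAML config. A minimal sketch, assuming the `general_settings.alerting` key and the `SLACK_WEBHOOK_URL` environment variable described in LiteLLM's proxy docs (treat exact field names and the threshold value as assumptions):

```yaml
# config.yaml for the LiteLLM proxy (sketch; field names assumed from the docs)
general_settings:
  alerting: ["slack"]        # enable Slack alerting; the UI toggle flips this
  alerting_threshold: 300    # seconds before a slow request triggers an alert (assumed)

# The Slack webhook itself is typically supplied via the
# SLACK_WEBHOOK_URL environment variable rather than the config file.
```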
2024-09-20 08:19:52 -07:00

Proxy UI

Create proxy keys and track spend per key

👉 The UI is available at /ui on your proxy. docs
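The key creation and spend tracking the UI exposes can also be driven directly against the proxy's key-management endpoints. A minimal sketch, assuming the proxy runs locally on port 4000 with master key `sk-1234`; the endpoint paths and request fields follow LiteLLM's proxy API, but the budget and metadata values are hypothetical:

```shell
# start the proxy (assumes a config.yaml exists in the current directory)
litellm --config config.yaml --port 4000

# the dashboard is then served at:
#   http://localhost:4000/ui

# create a proxy key with a spend budget (values are illustrative)
curl -X POST http://localhost:4000/key/generate \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{"max_budget": 10, "metadata": {"team": "demo"}}'

# inspect spend for a generated key
curl "http://localhost:4000/key/info?key=<generated-key>" \
  -H "Authorization: Bearer sk-1234"
```

The UI wraps these same endpoints, so anything created here shows up on the dashboard's key and spend views.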

(screenshot: ui_3, proxy UI dashboard)