litellm-mirror/ui
Krish Dholakia 08244aca0e fix(proxy_server.py): fix get model info when litellm_model_id is set + move model analytics to free (#7886)
* fix(proxy_server.py): fix get model info when litellm_model_id is set

Fixes https://github.com/BerriAI/litellm/issues/7873
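
The gist of the lookup fix, as a minimal sketch under assumptions: `model_list` and `get_model_info` below are illustrative stand-ins, not litellm's actual proxy internals. The idea is that a lookup keyed on `litellm_model_id` should return the same record the all-models listing reports for that deployment.

```python
# Hypothetical sketch: resolve model info for a single deployment by id,
# falling back to the full list when no id is given. The names here
# (model_list, get_model_info) are illustrative, not litellm's real API.
from typing import Optional

model_list = [
    {"model_name": "gpt-4", "model_info": {"id": "deployment-1", "mode": "chat"}},
    {"model_name": "gpt-4", "model_info": {"id": "deployment-2", "mode": "chat"}},
]


def get_model_info(litellm_model_id: Optional[str] = None) -> list[dict]:
    """Return info for one deployment when an id is passed, else all deployments."""
    if litellm_model_id is None:
        return model_list
    # Filter on the deployment id instead of the public model name, so the
    # returned record matches the corresponding entry in the full listing.
    return [m for m in model_list if m["model_info"]["id"] == litellm_model_id]


if __name__ == "__main__":
    assert get_model_info("deployment-1") == [model_list[0]]
    assert get_model_info() == model_list
```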

* test(test_models.py): add test to ensure get model info on specific deployment has same value as all model info

Fixes https://github.com/BerriAI/litellm/issues/7873
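
A hedged pytest-style sketch of such an equivalence check; `fake_all_model_info` and `fake_single_model_info` are hypothetical stand-ins for the proxy's model-info endpoints, not the actual test_models.py code.

```python
# Hypothetical test sketch: the info returned for a specific deployment id
# should equal that deployment's entry in the full model-info listing.
def fake_all_model_info() -> list[dict]:
    return [
        {"model_name": "gpt-4", "model_info": {"id": "deployment-1", "mode": "chat"}},
        {"model_name": "gpt-4", "model_info": {"id": "deployment-2", "mode": "chat"}},
    ]


def fake_single_model_info(litellm_model_id: str) -> list[dict]:
    return [m for m in fake_all_model_info() if m["model_info"]["id"] == litellm_model_id]


def test_specific_deployment_matches_all_model_info():
    all_info = fake_all_model_info()
    for entry in all_info:
        deployment_id = entry["model_info"]["id"]
        # Per-deployment lookup must agree with the all-models response.
        assert fake_single_model_info(deployment_id) == [entry]
```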

* fix(usage.tsx): make model analytics free

Addresses @iqballx's feedback

* fix(invoke_handler.py): fix bedrock error chunk parsing - return correct bedrock status code and error message if an error chunk appears in the stream

Improves bedrock stream error handling
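
A hedged sketch of the general error-chunk pattern, not the actual invoke_handler.py code: when a streamed chunk carries an error payload, raise with the provider's status code and message instead of a generic failure. The chunk shape and `StreamProviderError` class below are assumptions.

```python
# Illustrative sketch of surfacing an error chunk from a streamed response.
# The chunk format and exception type are assumptions, not Bedrock's actual
# wire format or litellm's real exception hierarchy.
from typing import Iterable, Iterator


class StreamProviderError(Exception):
    def __init__(self, status_code: int, message: str):
        super().__init__(f"{status_code}: {message}")
        self.status_code = status_code
        self.message = message


def iter_stream_chunks(chunks: Iterable[dict]) -> Iterator[str]:
    """Yield text deltas, raising with the provider's status/message on error chunks."""
    for chunk in chunks:
        if "error" in chunk:
            err = chunk["error"]
            # Propagate the upstream status code and message rather than a
            # generic 500, so callers can distinguish throttling, auth, etc.
            raise StreamProviderError(
                status_code=err.get("status_code", 500),
                message=err.get("message", "unknown provider error"),
            )
        yield chunk.get("text", "")


if __name__ == "__main__":
    good = [{"text": "hello "}, {"text": "world"}]
    print("".join(iter_stream_chunks(good)))
    try:
        list(iter_stream_chunks([{"error": {"status_code": 429, "message": "Throttled"}}]))
    except StreamProviderError as e:
        print(e.status_code, e.message)
```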

* fix(proxy_server.py): fix linting errors

* test(test_auth_checks.py): remove redundant test

* fix(proxy_server.py): fix linting errors

* test: fix flaky test

* test: fix test
2025-01-21 08:19:07 -08:00
litellm-dashboard fix(proxy_server.py): fix get model info when litellm_model_id is set + move model analytics to free (#7886) 2025-01-21 08:19:07 -08:00
package-lock.json build(deps): bump nanoid from 3.3.7 to 3.3.8 in /ui (#7198) 2024-12-12 12:04:54 -08:00
package.json build(ui/litellm-dashboard): initial commit of litellm dashboard 2024-01-27 12:12:48 -08:00