fix(proxy_server.py): fix get model info when litellm_model_id is set + move model analytics to free (#7886)

* fix(proxy_server.py): fix get model info when litellm_model_id is set

Fixes https://github.com/BerriAI/litellm/issues/7873

* test(test_models.py): add test to ensure get model info on specific deployment has same value as all model info

Fixes https://github.com/BerriAI/litellm/issues/7873

* fix(usage.tsx): make model analytics free

Fixes @iqballx's feedback

* fix(invoke_handler.py): fix bedrock error chunk parsing - return correct bedrock status code and error message if error chunk in stream

Improves bedrock stream error handling
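The error-chunk handling described above can be sketched as follows. This is a minimal illustration, not the actual `invoke_handler.py` code: the chunk shape, the AWS error names, and the `parse_error_chunk` helper are all assumptions for the sake of the example.

```python
from typing import Optional, Tuple

# Hypothetical mapping from Bedrock error types to HTTP status codes.
# The real handler may cover more error types; these three are common
# AWS-style exceptions used here for illustration.
ERROR_STATUS_MAP = {
    "throttlingException": 429,
    "validationException": 400,
    "internalServerException": 500,
}


def parse_error_chunk(chunk: dict) -> Optional[Tuple[int, str]]:
    """Return (status_code, message) if the stream chunk carries an
    error payload, else None so the caller treats it as a data chunk."""
    for error_key, status in ERROR_STATUS_MAP.items():
        if error_key in chunk:
            message = chunk[error_key].get("message", "unknown bedrock error")
            return status, message
    return None
```

A caller iterating the stream would check each decoded chunk with this helper and raise with the mapped status code instead of surfacing a generic 500.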

* fix(proxy_server.py): fix linting errors

* test(test_auth_checks.py): remove redundant test

* fix(proxy_server.py): fix linting errors

* test: fix flaky test

* test: fix test
Krish Dholakia 2025-01-21 08:19:07 -08:00 committed by GitHub
parent 0295f494b6
commit c8aa876785
8 changed files with 146 additions and 131 deletions


@@ -432,6 +432,7 @@ class Huggingface(BaseLLM):
embed_url: str,
) -> dict:
data: Dict = {}
## TRANSFORMATION ##
if "sentence-transformers" in model:
if len(input) == 0:
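The diff hunk above shows a guard inside the Hugging Face embedding transformation. A self-contained sketch of what that branch plausibly does is below; the payload shape (`source_sentence` / `sentences`) and the standalone `transform_embedding_input` function are assumptions for illustration, not the library's exact implementation.

```python
from typing import Dict, List


def transform_embedding_input(model: str, input: List[str]) -> Dict:
    """Build the request payload for a Hugging Face embedding call.

    sentence-transformers models expect a similarity-style payload,
    so an empty input list is rejected up front; other models pass
    the input list through unchanged.
    """
    data: Dict = {}
    ## TRANSFORMATION ##
    if "sentence-transformers" in model:
        if len(input) == 0:
            raise ValueError("sentence transformers requires at least one input")
        data["inputs"] = {
            "source_sentence": input[0],
            "sentences": input[1:],
        }
    else:
        data["inputs"] = input
    return data
```

Calling it with a non-empty list returns a payload keyed by `inputs`; an empty list for a sentence-transformers model raises immediately instead of failing at the API.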