Litellm dev 01 14 2025 p2 (#7772)

* feat(pass_through_endpoints.py): fix anthropic end user cost tracking

* fix(anthropic/chat/transformation.py): use the provider-returned model name for anthropic

handles the anthropic `-latest` tag in the request body throwing cost-calculation errors (see the sketch after this list)

ensures our model cost tracking stays accurate

* feat(model_prices_and_context_window.json): add gemini-2.0-flash-thinking-exp pricing (entry shape sketched after this list)

* test: update test to reflect the assumption that user_api_key_dict can provide the anthropic end user id

* test: fix test

* fix: fix test

* fix(anthropic_pass_through.py): uncomment the previous anthropic end-user cost tracking code block

we can't guarantee the user api key dict always has the end user id - there are too many code paths

* fix(user_api_key_auth.py): always read the end user id from the request body and set it on the auth object (see the usage example after the diff below)

* fix(auth_check.py): fix linting error

* test: fix auth check

* fix(auth_utils.py): fix `get_end_user_id_from_request_body` to handle `metadata = None`
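
The `-latest` fix works by preferring the concrete model name the provider returns over the alias that was sent in the request when per-token prices are looked up. A minimal sketch of that idea, not litellm's actual implementation - the helper names are made up and the prices are illustrative only:

```python
from typing import Optional

# Illustrative prices; the real values live in model_prices_and_context_window.json.
MODEL_PRICES = {
    "claude-3-5-sonnet-20241022": {
        "input_cost_per_token": 3e-06,
        "output_cost_per_token": 1.5e-05,
    },
}


def resolve_model_for_cost(requested_model: str, returned_model: Optional[str]) -> str:
    # Aliases like 'claude-3-5-sonnet-latest' have no pricing entry, so prefer the
    # concrete model name the provider echoed back in its response.
    return returned_model or requested_model


def completion_cost(
    requested_model: str,
    returned_model: Optional[str],
    prompt_tokens: int,
    completion_tokens: int,
) -> float:
    model = resolve_model_for_cost(requested_model, returned_model)
    prices = MODEL_PRICES[model]  # a bare '-latest' alias would raise KeyError here
    return (
        prompt_tokens * prices["input_cost_per_token"]
        + completion_tokens * prices["output_cost_per_token"]
    )


# The request used the alias, but the response identified the dated model,
# so the price lookup succeeds and the cost is attributed correctly.
print(completion_cost("claude-3-5-sonnet-latest", "claude-3-5-sonnet-20241022", 1000, 200))
```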
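
For the pricing commit, a new entry is added to model_prices_and_context_window.json. Roughly the shape such an entry takes, shown as the Python dict it parses into; the key names mirror existing entries in that file, while the numeric values here are placeholders rather than the real published limits and prices:

```python
# Placeholder values only -- consult model_prices_and_context_window.json for the
# actual token limits and per-token prices added by this commit.
gemini_2_flash_thinking_entry = {
    "gemini-2.0-flash-thinking-exp": {
        "max_tokens": 8192,            # placeholder
        "max_input_tokens": 1048576,   # placeholder
        "max_output_tokens": 8192,     # placeholder
        "input_cost_per_token": 0.0,   # placeholder
        "output_cost_per_token": 0.0,  # placeholder
        "litellm_provider": "gemini",
        "mode": "chat",
    }
}
```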

Krish Dholakia, 2025-01-15 21:34:50 -08:00 (committed by GitHub)
parent 73c004cfe5
commit 543655adc7
16 changed files with 287 additions and 43 deletions

```
@@ -464,6 +464,7 @@ def should_run_auth_on_pass_through_provider_route(route: str) -> bool:
    from litellm.proxy.proxy_server import general_settings, premium_user

    if premium_user is not True:
        return False

    # premium user has opted into using client credentials
```

```
@@ -493,3 +494,17 @@ def _has_user_setup_sso():
    )
    return sso_setup


def get_end_user_id_from_request_body(request_body: dict) -> Optional[str]:
    # openai - check 'user'
    if "user" in request_body:
        return request_body["user"]
    # anthropic - check 'litellm_metadata'
    end_user_id = request_body.get("litellm_metadata", {}).get("user", None)
    if end_user_id:
        return end_user_id
    metadata = request_body.get("metadata")
    if metadata and "user_id" in metadata:
        return metadata["user_id"]
    return None
```
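
A quick usage sketch of the new helper, covering the three lookup paths and the `metadata = None` case fixed in the last commit (the request bodies are made-up examples):

```python
# Assumes get_end_user_id_from_request_body from the diff above is in scope.
openai_style = {"model": "gpt-4o", "user": "end-user-1"}
anthropic_style = {
    "model": "claude-3-5-sonnet-latest",
    "litellm_metadata": {"user": "end-user-2"},
}
metadata_style = {"model": "gpt-4o", "metadata": {"user_id": "end-user-3"}}
no_metadata = {"model": "gpt-4o", "metadata": None}  # the `metadata = None` case

for body in (openai_style, anthropic_style, metadata_style, no_metadata):
    print(get_end_user_id_from_request_body(body))
# end-user-1, end-user-2, end-user-3, None
```

Per the commit message, user_api_key_auth.py then sets the returned id on the auth object so pass-through cost tracking can attribute spend to the end user.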