Fix anthropic prompt caching cost calc + trim logged message in db (#9838)

* fix(spend_tracking_utils.py): prevent logging entire mp4 files to db

Fixes https://github.com/BerriAI/litellm/issues/9732
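A minimal sketch of the idea behind this fix: truncate oversized string values (e.g. base64-encoded mp4 payloads) before the request body reaches the spend-logs table. The function name, cutoff, and truncation marker here are illustrative assumptions, not the exact code from the PR.

```python
# Hypothetical sketch: recursively truncate huge strings in a request
# body so binary blobs (like base64 mp4 data) are never logged whole.
MAX_STRING_LENGTH = 1000  # assumed cutoff, not the PR's actual value

def sanitize_value(value, max_len=MAX_STRING_LENGTH):
    """Walk dicts/lists, truncating any string longer than max_len."""
    if isinstance(value, str) and len(value) > max_len:
        return value[:max_len] + "... (truncated)"
    if isinstance(value, dict):
        return {k: sanitize_value(v, max_len) for k, v in value.items()}
    if isinstance(value, list):
        return [sanitize_value(v, max_len) for v in value]
    return value
```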

* fix(anthropic/chat/transformation.py): Fix double counting cache creation input tokens

Fixes https://github.com/BerriAI/litellm/issues/9812

* refactor(anthropic/chat/transformation.py): refactor streaming to use same usage calculation block as non-streaming

reduces divergence (and the chance of bugs) between the two code paths
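The usage fix above can be sketched as follows. The field names mirror the Anthropic usage object (`input_tokens`, `cache_creation_input_tokens`, `cache_read_input_tokens`); Anthropic reports `input_tokens` excluding cache tokens, so each cache bucket should be added exactly once. The aggregation helper below is an assumed illustration, not the PR's exact code.

```python
# Illustrative only: add cache_creation_input_tokens once, never twice,
# when deriving the total prompt-side token count from an Anthropic
# usage payload (input_tokens excludes the cache buckets).
def calculate_prompt_tokens(usage: dict) -> int:
    input_tokens = usage.get("input_tokens", 0)
    cache_creation = usage.get("cache_creation_input_tokens", 0)
    cache_read = usage.get("cache_read_input_tokens", 0)
    return input_tokens + cache_creation + cache_read
```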

* fix(bedrock/chat/converse_transformation.py): don't increment prompt tokens with cache_creation_input_tokens

* build: remove redisvl from requirements.txt (temporary)

* fix(spend_tracking_utils.py): handle circular references
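One common way to handle circular references during sanitization, sketched under the assumption that the payload is made of dicts, lists, and scalars: track visited container ids and substitute a placeholder when a container is revisited. The placeholder string and function name are hypothetical.

```python
# Hypothetical sketch: guard recursive sanitization against payloads
# that reference themselves, so the walk always terminates.
def sanitize(value, seen=None):
    if seen is None:
        seen = set()
    if isinstance(value, (dict, list)):
        if id(value) in seen:
            return "<circular reference>"  # placeholder is an assumption
        seen.add(id(value))
        if isinstance(value, dict):
            return {k: sanitize(v, seen) for k, v in value.items()}
        return [sanitize(v, seen) for v in value]
    return value
```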

* test: update code cov test

* test: update test
Krish Dholakia 2025-04-09 21:26:43 -07:00 committed by GitHub
parent 00c5c23d97
commit 87733c8193
GPG key ID: B5690EEEBB952194
9 changed files with 216 additions and 63 deletions


@@ -16,6 +16,8 @@ IGNORE_FUNCTIONS = [
"_transform_prompt",
"mask_dict",
"_serialize", # we now set a max depth for this
"_sanitize_request_body_for_spend_logs_payload", # testing added for circular reference
"_sanitize_value", # testing added for circular reference
]