mirror of
https://github.com/BerriAI/litellm.git
synced 2025-04-25 10:44:24 +00:00
Fix anthropic prompt caching cost calc + trim logged message in db (#9838)
* fix(spend_tracking_utils.py): prevent logging entire mp4 files to the db. Fixes https://github.com/BerriAI/litellm/issues/9732
* fix(anthropic/chat/transformation.py): fix double counting of cache creation input tokens. Fixes https://github.com/BerriAI/litellm/issues/9812
* refactor(anthropic/chat/transformation.py): refactor streaming to use the same usage-calculation block as non-streaming, reducing errors
* fix(bedrock/chat/converse_transformation.py): don't increment prompt tokens with cache_creation_input_tokens
* build: remove redisvl from requirements.txt (temporary)
* fix(spend_tracking_utils.py): handle circular references
* test: update code cov test
* test: update test
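The double-counting fix concerns how Anthropic reports usage: `input_tokens` already excludes the cached segments, which are reported separately as `cache_creation_input_tokens` and `cache_read_input_tokens`, so adding the cache-creation count on top of an already-summed total counts it twice. A minimal sketch of the corrected accounting (the function name is illustrative, not litellm's API):

```python
def calculate_prompt_tokens(usage: dict) -> int:
    """Total prompt tokens without double counting cache creation.

    Anthropic's `input_tokens` excludes cached segments, so the total
    is the plain sum of the three buckets; adding
    `cache_creation_input_tokens` to a pre-summed total again is the
    double count this commit fixes.
    """
    return (
        usage.get("input_tokens", 0)
        + usage.get("cache_creation_input_tokens", 0)
        + usage.get("cache_read_input_tokens", 0)
    )

usage = {
    "input_tokens": 100,
    "cache_creation_input_tokens": 2000,
    "cache_read_input_tokens": 0,
}
print(calculate_prompt_tokens(usage))  # 2100
```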
This commit is contained in:
parent
00c5c23d97
commit
87733c8193
9 changed files with 216 additions and 63 deletions
@@ -16,6 +16,8 @@ IGNORE_FUNCTIONS = [
     "_transform_prompt",
     "mask_dict",
     "_serialize",  # we now set a max depth for this
+    "_sanitize_request_body_for_spend_logs_payload",  # testing added for circular reference
+    "_sanitize_value",  # testing added for circular reference
 ]
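The circular-reference handling mentioned above can be sketched as a depth-capped recursive sanitizer. This is an illustrative stand-in, not litellm's actual `_sanitize_value` implementation: it tracks object ids along the current branch and bails out at a maximum depth so a self-referencing request body can be logged safely.

```python
def sanitize_value(value, depth=0, max_depth=10, seen=None):
    """Recursively sanitize a value for logging.

    Caps recursion at `max_depth` and replaces circular references
    with a placeholder instead of recursing forever.
    """
    seen = seen if seen is not None else frozenset()
    if depth > max_depth:
        return "<max-depth-exceeded>"
    if isinstance(value, (dict, list)):
        if id(value) in seen:
            return "<circular-reference>"
        seen = seen | {id(value)}  # track ids per branch only
    if isinstance(value, dict):
        return {k: sanitize_value(v, depth + 1, max_depth, seen) for k, v in value.items()}
    if isinstance(value, list):
        return [sanitize_value(v, depth + 1, max_depth, seen) for v in value]
    if isinstance(value, (str, int, float, bool)) or value is None:
        return value
    return str(value)  # fall back to a string for exotic objects

# A self-referencing payload no longer blows up the spend-log writer:
payload = {"model": "claude-3"}
payload["self"] = payload
print(sanitize_value(payload))
# {'model': 'claude-3', 'self': '<circular-reference>'}
```

Tracking ids per branch (rather than globally) still allows the same object to appear in two sibling positions; only true cycles are replaced.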