Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-25 02:34:29 +00:00)
* test_anthropic_cache_control_hook_system_message
* test_anthropic_cache_control_hook.py
* should_run_prompt_management_hooks
* fix should_run_prompt_management_hooks
* test_anthropic_cache_control_hook_specific_index
* fix test
* fix linting errors
* ChatCompletionCachedContent
* initial commit for cache control
* fixes ui design
* fix inserting cache_control_injection_points
* fix entering cache control points
* fixes for using cache control on ui + backend
* update cache control settings on edit model page
* fix init custom logger compatible class
* fix linting errors
* fix linting errors
* fix get_chat_completion_prompt
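The commit above adds hooks that inject Anthropic-style `cache_control` metadata into chat messages at configured injection points. As a rough illustration of what such a hook does, here is a minimal standalone sketch: the function name, injection-point parameters, and message handling are assumptions for illustration, not litellm's actual schema, though the `{"type": "ephemeral"}` block shape follows Anthropic's prompt-caching message format.

```python
# Hypothetical sketch of a cache-control injection hook. It attaches an
# Anthropic-style "cache_control" block to messages matched either by role
# or by a specific index, mirroring the system-message and specific-index
# test cases named in the commit. Not litellm's real implementation.

def inject_cache_control(messages, role="system", index=None):
    """Return a copy of `messages` with cache_control added to matches."""
    out = []
    for i, msg in enumerate(messages):
        msg = dict(msg)  # shallow copy so the caller's list is untouched
        matches = (i == index) if index is not None else (msg.get("role") == role)
        if matches:
            content = msg["content"]
            if isinstance(content, str):
                # Promote plain-string content to the block form that can
                # carry per-block cache_control metadata.
                content = [{"type": "text", "text": content}]
            content = [dict(block) for block in content]
            # Mark the last content block as cacheable.
            content[-1]["cache_control"] = {"type": "ephemeral"}
            msg["content"] = content
        out.append(msg)
    return out

msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
]
cached = inject_cache_control(msgs)
```

After the call, the system message's content becomes a list of blocks whose last block carries `{"cache_control": {"type": "ephemeral"}}`, while the original `msgs` list is left unmodified.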
integrations/
llms/
mcp_server/
passthrough_endpoints/
proxy/
adapter.py
caching.py
completion.py
embedding.py
files.py
fine_tuning.py
guardrails.py
rerank.py
router.py
scheduler.py
services.py
tag_management.py
utils.py