diff --git a/docs/my-website/docs/tutorials/prompt_caching.md b/docs/my-website/docs/tutorials/prompt_caching.md
index 7246788234..bf3d5a8dda 100644
--- a/docs/my-website/docs/tutorials/prompt_caching.md
+++ b/docs/my-website/docs/tutorials/prompt_caching.md
@@ -2,7 +2,6 @@
 import Image from '@theme/IdealImage';
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
-
 # Auto-Inject Prompt Caching Checkpoints
 
 Reduce costs by up to 90% by using LiteLLM to auto-inject prompt caching checkpoints.
@@ -14,9 +13,8 @@ Reduce costs by up to 90% by using LiteLLM to auto-inject prompt caching checkpo
 
 LiteLLM can automatically inject prompt caching checkpoints into your requests to LLM providers. This allows:
 
-- **Cached Processing**: Long, static parts of your prompts can be cached to avoid repeated processing
-- **Cost Reduction**: Only process the dynamic parts of your prompts, significantly reducing API costs
-- **Seamless Integration**: No need to modify your application code
+- **Cost Reduction**: Long, static parts of your prompts can be cached to avoid repeated processing
+- **No Code Changes Required**: Configure the auto-caching behavior in the LiteLLM UI or in your LiteLLM `config.yaml` file
 
 ## Configuration
 
diff --git a/docs/my-website/img/ui_auto_prompt_caching.png b/docs/my-website/img/ui_auto_prompt_caching.png
index 432deea5f7..e6f48e48d0 100644
Binary files a/docs/my-website/img/ui_auto_prompt_caching.png and b/docs/my-website/img/ui_auto_prompt_caching.png differ
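For context on the reworded bullet: a minimal sketch of the `config.yaml` approach it references, assuming the `cache_control_injection_points` parameter from LiteLLM's prompt-caching documentation; the model name and API key reference are illustrative.

```yaml
# Minimal sketch: auto-inject a prompt caching checkpoint on the
# system message of an Anthropic model (names are illustrative).
model_list:
  - model_name: anthropic-auto-inject-cache-system-message
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
      # Assumed setting: tells LiteLLM where to insert the
      # cache_control checkpoint on each request.
      cache_control_injection_points:
        - location: message
          role: system
```

With an entry like this, requests routed to `anthropic-auto-inject-cache-system-message` would get the checkpoint added by the proxy, so application code stays unchanged, which is the point of the updated bullet.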