docs: make it easier to find anthropic/openai prompt caching doc

Krrish Dholakia 2024-10-13 18:34:09 -07:00
parent 15b44c3221
commit 806a1c4acc
2 changed files with 10 additions and 1 deletion


@@ -7,7 +7,10 @@ import TabItem from '@theme/TabItem';
 :::info
-Need to use Caching on LiteLLM Proxy Server? Doc here: [Caching Proxy Server](https://docs.litellm.ai/docs/proxy/caching)
+- For Proxy Server? Doc here: [Caching Proxy Server](https://docs.litellm.ai/docs/proxy/caching)
+- For OpenAI/Anthropic Prompt Caching, go [here](../completion/prompt_caching.md)
 :::
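
For context, a minimal sketch of what the linked prompt caching doc covers, assuming LiteLLM's `completion()` call and the Anthropic-style `cache_control` content block; the model name and prompt text are illustrative:

```python
import litellm

# Mark a large, reusable prompt segment with a `cache_control` content
# block so the provider can cache it across calls.
response = litellm.completion(
    model="anthropic/claude-3-5-sonnet-20240620",
    messages=[
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    "text": "You are a legal assistant. <long contract text here>",
                    "cache_control": {"type": "ephemeral"},
                }
            ],
        },
        {"role": "user", "content": "Summarize the termination clause."},
    ],
)
print(response.choices[0].message.content)
```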


@@ -4,6 +4,12 @@ import TabItem from '@theme/TabItem';
 # Caching
 Cache LLM Responses
+:::note
+For OpenAI/Anthropic Prompt Caching, go [here](../completion/prompt_caching.md)
+:::
 LiteLLM supports:
 - In Memory Cache
 - Redis Cache
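
A minimal sketch of enabling these caches, assuming `litellm.caching.Cache` and the `caching=True` completion flag; the Redis connection values are placeholders:

```python
import litellm
from litellm.caching import Cache

# In-memory cache (default): identical completion() calls return the
# cached response instead of hitting the provider again.
litellm.cache = Cache()

# Or a Redis-backed cache shared across processes; host/port/password
# here are placeholders.
# litellm.cache = Cache(type="redis", host="localhost", port="6379", password="<password>")

messages = [{"role": "user", "content": "What is prompt caching?"}]
r1 = litellm.completion(model="gpt-3.5-turbo", messages=messages, caching=True)
r2 = litellm.completion(model="gpt-3.5-turbo", messages=messages, caching=True)
```

The second call is served from the cache because the model and messages match.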