Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-26 03:04:13 +00:00)

docs add mcp graphic to stable release notes

parent 22f9a93e25
commit 46e5ebe3c7
3 changed files with 10 additions and 1 deletion
BIN docs/my-website/img/mcp_ui.png (new file, 93 KiB; binary file not shown)
BIN docs/my-website/img/release_notes/mcp_ui.png (new file, 93 KiB; binary file not shown)
@@ -14,17 +14,26 @@ authors:
tags: [mcp]
hide_table_of_contents: false
---

import Image from '@theme/IdealImage';
LiteLLM v1.65.0 introduces significant enhancements including Model Context Protocol (MCP) tools, new models, and various performance improvements.
## Model Context Protocol (MCP)
-This release introduces support for centrally adding MCP servers on LiteLLM. This allows you to add MCP endpoints and your developers can `list` and `call` MCP tools through LiteLLM.
+This release introduces support for centrally adding MCP servers on LiteLLM. This allows you to add MCP server endpoints and your developers can `list` and `call` MCP tools through LiteLLM.
+
+<Image img={require('../../img/release_notes/mcp_ui.png')} />
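The `list` and `call` flow described above can be illustrated with a small conceptual sketch. This is not the actual LiteLLM API: the `MCPToolRegistry` class, its methods, and the `weather` server below are hypothetical names used only to show the idea of a central gateway that lists and invokes tools from registered MCP servers.

```python
# Conceptual sketch (hypothetical, not LiteLLM's real interface):
# a central registry that namespaces tools per MCP server, so clients
# can list every available tool and call one through a single gateway.

class MCPToolRegistry:
    def __init__(self):
        self._tools = {}  # "server/tool" -> callable

    def add_server_tools(self, server_name, tools):
        # Register each tool a server exposes under a namespaced key.
        for name, fn in tools.items():
            self._tools[f"{server_name}/{name}"] = fn

    def list_tools(self):
        # Tools from all registered servers, in a stable order.
        return sorted(self._tools)

    def call_tool(self, name, **kwargs):
        # Dispatch a call to the named tool.
        return self._tools[name](**kwargs)


registry = MCPToolRegistry()
registry.add_server_tools(
    "weather", {"get_forecast": lambda city: f"Sunny in {city}"}
)
print(registry.list_tools())  # -> ['weather/get_forecast']
```

The namespacing (`server/tool`) is one possible design choice for avoiding name collisions when several MCP servers expose tools with the same name.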
## Custom Prompt Management
This release allows you to connect LiteLLM to any prompt management service through our custom prompt management hooks. As the proxy admin, all you need to do is implement a `get_chat_completion_prompt` hook which accepts a `prompt_id` and `prompt_variables` and returns a formatted prompt.
<Image
  img={require('../../img/custom_prompt_management.png')}
  style={{width: '100%', display: 'block', margin: '2rem auto'}}
/>
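A minimal sketch of such a hook is shown below. Only the hook name `get_chat_completion_prompt` and its `prompt_id`/`prompt_variables` inputs come from the release notes; the exact signature, return shape, and the in-memory template store are assumptions for illustration, not LiteLLM's real interface.

```python
# Hypothetical sketch of a custom prompt management hook.
# A real implementation would fetch templates from your prompt
# management service instead of this in-memory store.

PROMPT_STORE = {
    "welcome-v1": "Hello {user_name}, how can I help with {topic}?",
}


def get_chat_completion_prompt(prompt_id, prompt_variables):
    """Look up a stored template by id and fill in its variables."""
    template = PROMPT_STORE[prompt_id]
    formatted = template.format(**prompt_variables)
    # Return the formatted prompt as a chat message list (assumed shape).
    return [{"role": "user", "content": formatted}]
```

For example, calling the hook with `prompt_id="welcome-v1"` and `prompt_variables={"user_name": "Ava", "topic": "billing"}` yields a single user message with both variables substituted.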
## Categorized Improvements and Fixes

### New Models / Updated Models