docs(mcp): add a few lines for how to specify Auth headers in MCP tools (#2336)

When instantiating an agent, you can provide it a list of tool groups that it has access to.

Refer to the [Building AI Applications](https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.ipynb) notebook for more examples on how to use tools.

## Server-side vs. client-side tool execution

Llama Stack allows you to use both server-side and client-side tools. With server-side tools, `agent.create_turn` can perform execution of the tool calls emitted by the model, transparently giving the user the final answer desired. If client-side tools are provided, the tool call is sent back to the user for execution and optional continuation using the `agent.resume_turn` method.

There are three types of providers for tool groups that are supported by Llama Stack:

1. Built-in providers
2. Model Context Protocol (MCP) providers
3. Client provided tools

### Server-side tools

Llama Stack provides built-in providers for some common tools. These include web search, math, and RAG capabilities.
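
If you are unsure what is already available on your distribution, you can list the registered tool groups with the Python client. A minimal sketch, assuming a `client` (`LlamaStackClient`) has already been constructed and the server is running:

```python
# List every toolgroup the stack currently knows about, along with the
# provider that will execute its tool calls (e.g. tavily-search, wolfram-alpha).
for toolgroup in client.toolgroups.list():
    print(toolgroup.identifier, "->", toolgroup.provider_id)
```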

#### Web Search

You have three providers to execute the web search tool calls generated by a model: Brave Search, Bing Search, and Tavily Search.

To indicate that the web search tool calls should be executed by `brave-search`, you can point the `builtin::websearch` toolgroup to the `brave-search` provider:

```python
# Register Brave Search tool group
client.toolgroups.register(
    toolgroup_id="builtin::websearch",
    provider_id="brave-search",
)
```

The tool requires an API key which can be provided either in the configuration or through the request header `X-LlamaStack-Provider-Data`. The format of the header is:
```
{"<provider_name>_api_key": <your api key>}
```

> **NOTE:** When using Tavily Search or Bing Search, the inference output will still display "Brave Search." This is because Llama models have been trained with Brave Search as a built-in tool; Tavily or Bing is simply used in place of Brave Search.
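
For example, with the Python client you can attach this header to every request when constructing the client. This is a sketch rather than the only option: the `default_headers` argument and the `brave_search_api_key` field name are assumptions here, so check your client version and the provider's documentation for the exact key name.

```python
import json

from llama_stack_client import LlamaStackClient

client = LlamaStackClient(
    base_url="http://localhost:8321",
    # Send the provider API key with every request using the
    # {"<provider_name>_api_key": <your api key>} format described above.
    default_headers={
        "X-LlamaStack-Provider-Data": json.dumps(
            {"brave_search_api_key": "<your_brave_search_api_key>"}
        )
    },
)
```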

#### Math

The WolframAlpha tool provides access to computational knowledge through the WolframAlpha API.

```python
# Register WolframAlpha tool group
client.toolgroups.register(
    toolgroup_id="builtin::wolfram_alpha", provider_id="wolfram-alpha"
)
```

> **Note:** By default, the llama stack `run.yaml` defines toolgroups for web search, WolframAlpha, and RAG, provided by the tavily-search, wolfram-alpha, and rag providers respectively.
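
Once a server-side toolgroup such as `builtin::wolfram_alpha` is registered (or already defined in your `run.yaml`), an agent can use it simply by referencing its name. A minimal sketch, assuming the `Agent` helper from the `llama_stack_client` Python SDK, an existing `client`, and a placeholder model id:

```python
from llama_stack_client import Agent

agent = Agent(
    client,
    model="<your_model_id>",
    instructions="You are a helpful assistant. Use the WolframAlpha tool for math questions.",
    tools=["builtin::wolfram_alpha"],
)

# Server-side execution: any wolfram_alpha tool calls emitted by the model are
# executed by the stack inside create_turn, and the final answer is returned.
session_id = agent.create_session("math-session")
response = agent.create_turn(
    messages=[{"role": "user", "content": "What is the integral of x^2 from 0 to 3?"}],
    session_id=session_id,
    stream=False,
)
print(response.output_message.content)
```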

## Model Context Protocol (MCP)

[MCP](https://github.com/modelcontextprotocol) is an emerging, popular standard for tool discovery and execution. It is a protocol that allows tools to be dynamically discovered from an MCP endpoint and used to extend the agent's capabilities.

Refer to [https://github.com/modelcontextprotocol/servers](https://github.com/modelcontextprotocol/servers) for available MCP servers.
### Using Remote MCP Servers
You can find some popular remote MCP servers [here](https://github.com/jaw9c/awesome-remote-mcp-servers). You can register them as toolgroups in the same way as local providers.

```python
client.toolgroups.register(
    toolgroup_id="mcp::deepwiki",
    provider_id="model-context-protocol",
    mcp_endpoint=URL(uri="https://mcp.deepwiki.com/sse"),
)
```

Note that most of the more useful MCP servers require you to authenticate with them, and many of them use OAuth 2.0 for authentication. You can provide the authorization headers to send to the MCP server using the "Provider Data" abstraction provided by Llama Stack. When making an agent call:

```python
agent = Agent(
    ...,
    tools=["mcp::deepwiki"],
    extra_headers={
        "X-LlamaStack-Provider-Data": json.dumps(
            {
                # Headers are keyed by the MCP endpoint URI they should be sent to.
                "mcp_headers": {
                    "https://mcp.deepwiki.com/sse": {
                        "Authorization": "Bearer <your_access_token>",
                    },
                },
            }
        ),
    },
)
agent.create_turn(...)
```

### Running your own MCP server
Here's an example of how to run a simple MCP server that exposes a File System as a set of tools to the Llama Stack agent.

```shell
# start your MCP server
...
```
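
Once the server is running, register it as a toolgroup the same way as the remote example above, pointing `mcp_endpoint` at wherever your server is listening. A sketch, assuming the server exposes an SSE endpoint on localhost port 8000 and using an illustrative `mcp::filesystem` toolgroup name:

```python
# Register the locally running MCP server; its tools are discovered dynamically.
client.toolgroups.register(
    toolgroup_id="mcp::filesystem",
    provider_id="model-context-protocol",
    mcp_endpoint=URL(uri="http://localhost:8000/sse"),
)
```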

MCP tools require:
- A valid MCP endpoint URL
- The endpoint must implement the Model Context Protocol
- Tools are discovered dynamically from the endpoint
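
You can check what was discovered from the endpoint by listing the tools in the registered toolgroup. A sketch, reusing the illustrative `mcp::filesystem` toolgroup from above (the exact filter parameter may differ between client versions):

```python
# Tools are discovered dynamically from the MCP endpoint at registration time.
for tool in client.tools.list(toolgroup_id="mcp::filesystem"):
    print(tool.identifier, "-", tool.description)
```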

## Adding Custom (Client-side) Tools

When you want to use tools other than the built-in tools, you just need to implement a Python function with a docstring. The content of the docstring will be used to describe the tool and its parameters, and will be passed along to the generative model.
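
A minimal sketch of a client-side tool and how it can be passed to an agent; the function itself and the agent wiring are illustrative:

```python
from llama_stack_client import Agent


def get_boiling_point(liquid_name: str, celsius: bool = True) -> int:
    """
    Returns the boiling point of a liquid in Celsius or Fahrenheit.

    :param liquid_name: The name of the liquid
    :param celsius: Whether to return the boiling point in Celsius
    :return: The boiling point of the liquid in Celsius or Fahrenheit
    """
    if liquid_name.lower() == "water":
        return 100 if celsius else 212
    return -1


# Client-side tools are passed as Python callables; the docstring above is what
# the model sees as the tool description. When the model calls this tool, the
# call is returned to the client for execution (see agent.resume_turn above).
agent = Agent(
    client,
    model="<your_model_id>",
    instructions="You are a helpful assistant.",
    tools=[get_boiling_point],
)
```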