# What does this PR do?

Closes https://github.com/meta-llama/llama-stack/issues/1586.

The issue arises when loading an `mcp_endpoint` from run.yaml; it does not manifest for MCP servers added to an already-running distro server, and the existing tests only cover that case. The code that loads run.yaml strips the type information from `mcp_endpoint`, passing `{"uri": ...}` instead of `URL(uri=...)` along to the resource provider registration.

## Test Plan

1. Run an MCP server.
2. Add an MCP tool config to dev.py, e.g.:

   ```
   diff --git a/llama_stack/templates/dev/dev.py b/llama_stack/templates/dev/dev.py
   index 69924acb..e0dc7189 100644
   --- a/llama_stack/templates/dev/dev.py
   +++ b/llama_stack/templates/dev/dev.py
   @@ -6,6 +6,8 @@
    from typing import List, Tuple

   +from llama_stack.apis.common.content_types import URL
   +
    from llama_stack.apis.models.models import ModelType
    from llama_stack.distribution.datatypes import (
        ModelInput,
   @@ -154,6 +156,11 @@ def get_distribution_template() -> DistributionTemplate:
                toolgroup_id="builtin::code_interpreter",
                provider_id="code-interpreter",
            ),
   +        ToolGroupInput(
   +            toolgroup_id="mcp::filesystem",
   +            provider_id="model-context-protocol",
   +            mcp_endpoint=URL(uri="http://localhost:8002/sse"),
   +        ),
        ]

        embedding_model = ModelInput(
            model_id="all-MiniLM-L6-v2",
   ```

3. Run distro_codegen.py.
4. `llama stack build --template dev --run`

Before this PR, `llama stack run` would fail with `AttributeError: 'dict' object has no attribute 'uri'`; after it, the run succeeds.
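For context, below is a minimal sketch of the kind of coercion that avoids this failure. It is not the literal patch from this PR: `normalize_mcp_endpoint` is a hypothetical helper, and `URL` is re-declared locally (standing in for `llama_stack.apis.common.content_types.URL`, assumed to be a pydantic model with a single `uri` field) so the snippet runs on its own.

```python
from typing import Any, Optional

from pydantic import BaseModel


class URL(BaseModel):
    # Local stand-in for llama_stack.apis.common.content_types.URL,
    # assumed to be a pydantic model with a single `uri` field.
    uri: str


def normalize_mcp_endpoint(endpoint: Any) -> Optional[URL]:
    """Hypothetical helper: accept either a URL instance or the plain dict
    produced by run.yaml loading, and always return a URL (or None)."""
    if endpoint is None:
        return None
    if isinstance(endpoint, dict):
        # run.yaml loading strips the type, leaving {"uri": ...};
        # rebuild the model so downstream code can rely on `.uri`.
        return URL(**endpoint)
    return endpoint


# Registration code that expects only URL instances raises AttributeError
# on the dict form; after normalization, both inputs behave the same.
assert normalize_mcp_endpoint({"uri": "http://localhost:8002/sse"}).uri == "http://localhost:8002/sse"
assert normalize_mcp_endpoint(URL(uri="http://localhost:8002/sse")).uri == "http://localhost:8002/sse"
```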