llama-stack/llama_stack/models/llama/llama3/prompt_templates/tool_response.py
Ashwin Bharambe 314ee09ae3
chore: move all Llama Stack types from llama-models to llama-stack (#1098)
llama-models should have extremely minimal cruft. Its sole purpose
should be didactic -- show the simplest implementation of the llama
models and document the prompt formats, etc.

This PR is the complement to
https://github.com/meta-llama/llama-models/pull/279

## Test Plan

Ensure all `llama` CLI `model` sub-commands work:

```bash
llama model list
llama model download --model-id ...
llama model prompt-format -m ...
```

Ran tests:
```bash
cd tests/client-sdk
LLAMA_STACK_CONFIG=fireworks pytest -s -v inference/
LLAMA_STACK_CONFIG=fireworks pytest -s -v vector_io/
LLAMA_STACK_CONFIG=fireworks pytest -s -v agents/
```

Created a fresh venv (`uv venv && source .venv/bin/activate`), then ran
`llama stack build --template fireworks --image-type venv` followed by
`llama stack run together --image-type venv` -- the server starts and runs successfully.

Also checked that the OpenAPI generator can run and there is no change
in the generated files as a result.

```bash
cd docs/openapi_generator
sh run_openapi_generator.sh
```
2025-02-14 09:10:59 -08:00


# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# the root directory of this source tree.
# Copyright (c) Meta Platforms, Inc. and affiliates.
# All rights reserved.
#
# This source code is licensed under the terms described in the LICENSE file in
# top-level folder for each specific model found within the models/ directory at
# the top-level of this source tree.
import textwrap
from typing import Optional

from .base import PromptTemplate, PromptTemplateGeneratorBase


class ToolResponseGenerator(PromptTemplateGeneratorBase):
def gen(
self,
status: str,
stdout: Optional[str] = None,
stderr: Optional[str] = None,
):
assert status in [
"success",
"failure",
], f"status must be 'success' or 'failure'; Got: {status}"
template_str = textwrap.dedent(
"""
{% if status == "success" %}completed{% else %}failed{% endif %}
{%- if stdout %}
[stdout]{{ stdout }}[/stdout]
{%- endif -%}
{%- if stderr %}
[stderr]{{ stderr }}[/stderr]
{%- endif -%}
"""
)
return PromptTemplate(
template_str.lstrip("\n"),
{
"status": status,
"stdout": stdout,
"stderr": stderr,
},
)

    def data_examples(self):
return [
# success
{
"status": "success",
"stdout": '{"results":["something something"]}',
},
# failure
{
"status": "failure",
"stderr": "brave_search encounter an error: could not communicate with api.brave.com",
},
]
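
For illustration, the output this Jinja template produces can be approximated with a small pure-Python sketch that needs no template engine. `render_tool_response` is a hypothetical helper written for this example, not part of llama-stack; it mirrors the template's whitespace-control behavior (`{%- ... -%}`), which joins each segment with a single newline:

```python
from typing import Optional


def render_tool_response(
    status: str,
    stdout: Optional[str] = None,
    stderr: Optional[str] = None,
) -> str:
    # Mirrors the template above: emit "completed" or "failed", then
    # optional [stdout]...[/stdout] and [stderr]...[/stderr] segments,
    # each on its own line.
    assert status in ("success", "failure"), f"status must be 'success' or 'failure'; Got: {status}"
    parts = ["completed" if status == "success" else "failed"]
    if stdout:
        parts.append(f"[stdout]{stdout}[/stdout]")
    if stderr:
        parts.append(f"[stderr]{stderr}[/stderr]")
    return "\n".join(parts)


if __name__ == "__main__":
    print(render_tool_response("success", stdout='{"results":["something something"]}'))
    print(render_tool_response("failure", stderr="could not communicate with api.brave.com"))
```

This is the text the model sees as the tool's response; the `[stdout]`/`[stderr]` delimiters let it distinguish the two output streams when deciding how to summarize or recover from a tool call.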