# Llama Stack Documentation
Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our GitHub page.
## Render locally

From the llama-stack `docs/` directory, run the following commands to render the docs locally:
```bash
npm install
npm run gen-api-docs all
npm run build
npm run serve
```
You can then open the docs in your browser at http://localhost:3000.
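If port 3000 is already taken, the built site can usually be served elsewhere. This is a sketch that assumes the `serve` script in `package.json` forwards extra arguments to `docusaurus serve` (not confirmed by this README):

```bash
# Serve the already-built docs on an alternate port and bind to all interfaces
npm run serve -- --port 3001 --host 0.0.0.0
```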
## Content
Try out Llama Stack's capabilities through our detailed Jupyter notebooks:
- Building AI Applications Notebook - A comprehensive guide to building production-ready AI applications using Llama Stack
- Benchmark Evaluations Notebook - Detailed performance evaluations and benchmarking results
- Zero-to-Hero Guide - Step-by-step guide for getting started with Llama Stack