Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-06-28 02:53:30 +00:00)
# What does this PR do?

This expands upon the existing `distro_codegen.py` text in the new API provider documentation to include a note about not including provider-specific dependencies in the code path that builds the distribution's template. Our `distro_codegen` pre-commit hook will catch this case anyway, but this note informs provider authors ahead of time.

## Test Plan

I built the docs website locally via the following:

```
pip install -r docs/requirements.txt
sphinx-build -M html docs/source docs_output
```

Then I opened the newly generated `docs_output/html/contributing/new_api_provider.html` in my browser and confirmed everything rendered correctly.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
Directory contents:

- `_static/`
- `notebooks/`
- `openapi_generator/`
- `resources/`
- `source/`
- `zero_to_hero_guide/`
- `conftest.py`
- `contbuild.sh`
- `dog.jpg`
- `getting_started.ipynb`
- `license_header.txt`
- `make.bat`
- `Makefile`
- `readme.md`
- `requirements.txt`
# Llama Stack Documentation
Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our ReadTheDocs page.
## Content
Try out Llama Stack's capabilities through our detailed Jupyter notebooks:
- Building AI Applications Notebook - A comprehensive guide to building production-ready AI applications using Llama Stack
- Benchmark Evaluations Notebook - Detailed performance evaluations and benchmarking results
- Zero-to-Hero Guide - Step-by-step guide for getting started with Llama Stack