The `--image-name __system__` option was a hack, and a bad one at that. The actual intent was to automatically detect the notebook environment so we could avoid unnecessarily confusing options on the `llama stack build` command line, but I failed at that, which led us to the `__system__` fallback. Let's just do the simple thing instead. Note that I haven't changed `build_venv.sh` for now, so it still honors the `__system__` special name; no new user should rely on it, though.

## Test Plan

Open the notebooks from this branch in Colab (see the example URL below) and ensure the builds work.

https://colab.research.google.com/github/meta-llama/llama-stack/blob/foo/docs/getting_started.ipynb

In the notebook, install llama-stack from this branch directly using:

```
!pip install -U https://github.com/meta-llama/llama-stack/archive/refs/heads/foo.zip
```

Verify that `!UV_SYSTEM_PYTHON=1 llama stack build --template together --image-type venv` succeeds afterwards and that library client initialization also works.
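For context, the "automatically detect the notebook environment" idea mentioned above is usually done with a check like the one below. This is a minimal sketch, not llama-stack's actual implementation; the function name is hypothetical, and it only covers Colab and standard Jupyter kernels.

```python
# Sketch: best-effort detection of a Jupyter/Colab notebook environment.
# Not llama-stack's implementation; function name is hypothetical.
import sys


def running_in_notebook() -> bool:
    """Return True if this code appears to be running inside a notebook."""
    # Google Colab injects the `google.colab` module into sys.modules.
    if "google.colab" in sys.modules:
        return True
    try:
        from IPython import get_ipython  # only available when IPython is installed
    except ImportError:
        return False
    shell = get_ipython()
    # A ZMQ-based interactive shell means we are attached to a Jupyter kernel.
    return shell is not None and shell.__class__.__name__ == "ZMQInteractiveShell"


if __name__ == "__main__":
    print(running_in_notebook())
```
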
# Llama Stack Documentation
Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our ReadTheDocs page.
## Content
Try out Llama Stack's capabilities through our detailed Jupyter notebooks:
- Building AI Applications Notebook - A comprehensive guide to building production-ready AI applications using Llama Stack
- Benchmark Evaluations Notebook - Detailed performance evaluations and benchmarking results
- Zero-to-Hero Guide - Step-by-step guide for getting started with Llama Stack
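For reference, the notebooks above exercise Llama Stack as an in-process library client, which is also the "library client initialization" step mentioned in the test plan earlier. The sketch below shows roughly what that looks like; the import path, class name, `together` template, and `TOGETHER_API_KEY` variable are assumptions based on that test plan and may differ between releases, so check the current documentation for the exact API.

```python
# Hedged sketch of initializing Llama Stack as a library client.
# Import path, class name, and template name are assumptions; verify against
# the release of llama-stack you have installed.
import os

from llama_stack.distribution.library_client import LlamaStackAsLibraryClient

# The `together` template needs a Together API key (assumed variable name).
os.environ.setdefault("TOGETHER_API_KEY", "<your-api-key>")

client = LlamaStackAsLibraryClient("together")
client.initialize()  # loads the distribution produced by `llama stack build`

# Quick smoke test: list the models registered by the template.
for model in client.models.list():
    print(model.identifier)
```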