Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-10-03 19:57:35 +00:00
# Llama Stack Documentation
Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our GitHub page.
## Render locally

From the llama-stack `docs/` directory, run the following commands to render the docs locally:
```sh
npm install
npm run gen-api-docs all
npm run build
npm run serve
```
You can then open the docs in your browser at http://localhost:3000.
## Content
Try out Llama Stack's capabilities through our detailed Jupyter notebooks:
- Building AI Applications Notebook - A comprehensive guide to building production-ready AI applications using Llama Stack
- Benchmark Evaluations Notebook - Detailed performance evaluations and benchmarking results
- Zero-to-Hero Guide - Step-by-step guide for getting started with Llama Stack