Implements a complete S3-based file storage provider for Llama Stack with:
Core Implementation:
- S3FilesImpl class with full OpenAI Files API compatibility
- Support for file upload, download, listing, deletion operations
- SQLite-based metadata storage for fast queries and API compliance (see the sketch after this list)
- Configurable S3 endpoints (AWS, MinIO, LocalStack support)
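To make that combination concrete, here is a minimal sketch of pairing boto3 object storage with a SQLite metadata table. The bucket name, the `files_metadata` table, and the `upload_file`/`download_file` helpers are hypothetical illustrations, not the provider's actual internals.

```python
# Illustrative only: S3 object storage paired with SQLite metadata.
import sqlite3
import time
import uuid

import boto3

s3 = boto3.client("s3")  # assumes AWS credentials are configured in the environment
BUCKET = "llama-stack-files"  # example bucket name

db = sqlite3.connect("files_metadata.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS files_metadata ("
    "id TEXT PRIMARY KEY, filename TEXT, bytes INTEGER, created_at INTEGER)"
)


def upload_file(filename: str, data: bytes) -> str:
    """Store the bytes in S3 and record metadata in SQLite; return the file id."""
    file_id = f"file-{uuid.uuid4().hex}"
    s3.put_object(Bucket=BUCKET, Key=file_id, Body=data)
    db.execute(
        "INSERT INTO files_metadata VALUES (?, ?, ?, ?)",
        (file_id, filename, len(data), int(time.time())),
    )
    db.commit()
    return file_id


def download_file(file_id: str) -> bytes:
    """Fetch the object body for a previously uploaded file."""
    return s3.get_object(Bucket=BUCKET, Key=file_id)["Body"].read()
```

Keeping metadata in SQLite lets listing and filtering happen locally without round-tripping to S3 for every query.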
Key Features:
- Automatic S3 bucket creation and management (sketched after this list)
- Metadata persistence
- Proper error handling for S3 connectivity and permissions
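A hedged sketch of what automatic bucket creation with basic error handling can look like in boto3; the `ensure_bucket` helper and its error mapping are assumptions for illustration, not the provider's exact logic.

```python
# Illustrative only: create the bucket if it does not exist, and surface
# connectivity/permission problems with clear errors.
import boto3
from botocore.exceptions import ClientError, EndpointConnectionError


def ensure_bucket(s3, bucket: str) -> None:
    """Create `bucket` if missing; re-raise permission/connectivity errors."""
    try:
        s3.head_bucket(Bucket=bucket)
        return  # bucket already exists and is reachable
    except EndpointConnectionError as e:
        raise RuntimeError(f"Cannot reach S3 endpoint: {e}") from e
    except ClientError as e:
        code = e.response["Error"]["Code"]
        if code in ("404", "NoSuchBucket"):
            # Default region; other regions need CreateBucketConfiguration.
            s3.create_bucket(Bucket=bucket)
        elif code in ("403", "AccessDenied"):
            raise PermissionError(f"No access to bucket {bucket!r}") from e
        else:
            raise


ensure_bucket(boto3.client("s3"), "llama-stack-files")
```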
Dependencies:
- Adds boto3 for AWS S3 integration
- Adds moto[s3] for testing infrastructure (example test below)
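For a sense of how moto fits in, here is a minimal sketch of a moto-backed unit test, assuming moto 5.x's `mock_aws` decorator (older releases expose `mock_s3` instead); the test body is illustrative and not taken from the repository's test suite.

```python
# Illustrative moto-based test: S3 calls are served from an in-memory fake,
# so no real AWS account or network access is needed.
import boto3
from moto import mock_aws  # moto >= 5.0


@mock_aws
def test_put_and_get_object():
    s3 = boto3.client(
        "s3",
        region_name="us-east-1",
        aws_access_key_id="testing",       # dummy credentials; moto never contacts AWS
        aws_secret_access_key="testing",
    )
    s3.create_bucket(Bucket="llama-stack-files")
    s3.put_object(Bucket="llama-stack-files", Key="file-123", Body=b"hello")
    body = s3.get_object(Bucket="llama-stack-files", Key="file-123")["Body"].read()
    assert body == b"hello"
```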
Testing:
Unit: `./scripts/unit-tests.sh tests/unit/files tests/unit/providers/files`
Integration (see the endpoint sketch after these steps):
Start MinIO: `podman run --rm -it -p 9000:9000 minio/minio server /data`
Start stack w/ S3 provider: `S3_ENDPOINT_URL=http://localhost:9000 AWS_ACCESS_KEY_ID=minioadmin AWS_SECRET_ACCESS_KEY=minioadmin S3_BUCKET_NAME=llama-stack-files uv run llama stack build --image-type venv --providers files=remote::s3 --run`
Run integration tests: `./scripts/integration-tests.sh --stack-config http://localhost:8321 --provider ollama --test-subdirs files`
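As a rough illustration of how the environment variables in the commands above let MinIO (or LocalStack) stand in for AWS S3, here is a hedged sketch of building a boto3 client from them; the client construction is illustrative, not the provider's exact code.

```python
# Illustrative only: build an S3 client from the same environment variables used above,
# so MinIO/LocalStack can serve as the S3 endpoint during local testing.
import os

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url=os.environ.get("S3_ENDPOINT_URL"),  # e.g. http://localhost:9000 for MinIO
    aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID", "minioadmin"),
    aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY", "minioadmin"),
    region_name=os.environ.get("AWS_DEFAULT_REGION", "us-east-1"),
)
bucket = os.environ.get("S3_BUCKET_NAME", "llama-stack-files")
print(s3.list_objects_v2(Bucket=bucket).get("KeyCount", 0), "objects in", bucket)
```

When `S3_ENDPOINT_URL` is unset, the client falls back to the default AWS endpoint, so the same code path works against real S3 and local stand-ins.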
| Name |
|---|
| _static |
| notebooks |
| openapi_generator |
| resources |
| source |
| zero_to_hero_guide |
| conftest.py |
| contbuild.sh |
| dog.jpg |
| getting_started.ipynb |
| getting_started_llama4.ipynb |
| getting_started_llama_api.ipynb |
| license_header.txt |
| make.bat |
| Makefile |
| original_rfc.md |
| quick_start.ipynb |
| README.md |
Llama Stack Documentation
Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our ReadTheDocs page.
Render locally
From the llama-stack root directory, run the following command to render the docs locally:
`uv run --group docs sphinx-autobuild docs/source docs/build/html --write-all`
You can then open the docs in your browser at http://localhost:8000
Content
Try out Llama Stack's capabilities through our detailed Jupyter notebooks:
- Building AI Applications Notebook - A comprehensive guide to building production-ready AI applications using Llama Stack
- Benchmark Evaluations Notebook - Detailed performance evaluations and benchmarking results
- Zero-to-Hero Guide - Step-by-step guide for getting started with Llama Stack