feat: Add S3 Files Provider (#3202)
Implements a complete S3-based file storage provider for Llama Stack
with:
Core Implementation:
- S3FilesImpl class with full OpenAI Files API compatibility
- Support for file upload, download, listing, and deletion operations
- SQLite-based metadata storage for fast queries and API compliance
- Configurable S3 endpoints (AWS, MinIO, LocalStack support); a client-setup sketch follows this list
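
As a rough illustration of the configurable-endpoint idea, the sketch below wires a small config object into a boto3 client. The config field names (`bucket_name`, `region`, `endpoint_url`) are assumptions for illustration, not the provider's actual schema.

```python
# Illustrative sketch only: config field names are assumptions, not the provider's actual schema.
from dataclasses import dataclass

import boto3


@dataclass
class S3FilesConfig:
    bucket_name: str
    region: str = "us-east-1"
    endpoint_url: str | None = None  # e.g. "http://localhost:9000" for MinIO or LocalStack


def make_s3_client(config: S3FilesConfig):
    # With endpoint_url=None, boto3 targets the standard AWS S3 endpoint; credentials are
    # resolved through the usual chain (environment variables, shared config, IAM role).
    return boto3.client("s3", region_name=config.region, endpoint_url=config.endpoint_url)
```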
Key Features:
- Automatic S3 bucket creation and management
- Metadata persistence
- Proper error handling for S3 connectivity and permissions
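
A hedged sketch of the bucket auto-creation and error-handling pattern described above: probe the bucket, create it if missing, and surface permission or connectivity failures instead of swallowing them. Function and variable names are illustrative, not the provider's exact code.

```python
# Illustrative sketch of automatic bucket creation; not the provider's exact error handling.
from botocore.exceptions import ClientError


def ensure_bucket(client, bucket_name: str) -> None:
    try:
        client.head_bucket(Bucket=bucket_name)
    except ClientError as exc:
        code = exc.response["Error"]["Code"]
        if code in ("404", "NoSuchBucket"):
            # Missing bucket: create it so uploads can proceed.
            # (Outside us-east-1, create_bucket also needs a LocationConstraint.)
            client.create_bucket(Bucket=bucket_name)
        else:
            # 403s and connectivity failures are surfaced rather than swallowed.
            raise RuntimeError(f"Cannot access S3 bucket '{bucket_name}': {code}") from exc
```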
Dependencies:
- Adds boto3 for AWS S3 integration
- Adds moto[s3] for testing infrastructure
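
With `moto[s3]`, unit tests can exercise the S3 path without any network access. The test below is a hypothetical example of the pattern, not one of the tests added in this commit.

```python
# Hypothetical unit-test sketch using moto to emulate S3 in-process.
import boto3
from moto import mock_aws  # moto >= 5 exposes a single mock_aws entry point


@mock_aws
def test_upload_and_read_back():
    # moto intercepts all botocore calls, so nothing here ever reaches AWS.
    client = boto3.client("s3", region_name="us-east-1")
    client.create_bucket(Bucket="llama-stack-files")

    client.put_object(Bucket="llama-stack-files", Key="file-abc123", Body=b"hello")
    body = client.get_object(Bucket="llama-stack-files", Key="file-abc123")["Body"].read()
    assert body == b"hello"
```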
Testing:

Unit: `./scripts/unit-tests.sh tests/unit/files tests/unit/providers/files`

Integration:
- Start MinIO: `podman run --rm -it -p 9000:9000 minio/minio server /data`
- Start stack w/ S3 provider: `S3_ENDPOINT_URL=http://localhost:9000 AWS_ACCESS_KEY_ID=minioadmin AWS_SECRET_ACCESS_KEY=minioadmin S3_BUCKET_NAME=llama-stack-files uv run llama stack build --image-type venv --providers files=remote::s3 --run`
- Run integration tests: `./scripts/integration-tests.sh --stack-config http://localhost:8321 --provider ollama --test-subdirs files`
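
Since the provider targets OpenAI Files API compatibility, the stack started above can also be exercised with the `openai` client. This is a hypothetical smoke test: the `base_url` path and dummy API key are assumptions about the local deployment, not values taken from this commit.

```python
# Hypothetical smoke test against the locally running stack; base_url is an assumption.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1/openai/v1", api_key="not-needed")

# Upload a file through the OpenAI-compatible Files API, then list and delete it.
with open("example.txt", "rb") as f:
    uploaded = client.files.create(file=f, purpose="assistants")

print([item.id for item in client.files.list().data])
client.files.delete(uploaded.id)
```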
Parent: c5e2e269e2
Commit: f520e244d9
11 changed files with 982 additions and 2 deletions
```diff
@@ -98,6 +98,7 @@ unit = [
     "together",
     "coverage",
     "chromadb>=1.0.15",
+    "moto[s3]>=5.1.10",
 ]
 # These are the core dependencies required for running integration tests. They are shared across all
 # providers. If a provider requires additional dependencies, please add them to your environment
```