llama-stack-mirror/llama_stack/providers
Matthew Farrellee · 8cdfdbe884 · feat: Add S3 Files Provider implementation · 2025-08-20 14:23:57 -04:00
Implements a complete S3-based file storage provider for Llama Stack with:

Core Implementation:
- S3FilesImpl class with full OpenAI Files API compatibility
- Support for file upload, download, listing, and deletion operations
- SQLite-based metadata storage for fast queries and API compliance
- Configurable S3 endpoints (AWS, MinIO, LocalStack); a minimal sketch of the storage layout follows this list
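
For orientation, here is a minimal sketch of that storage split: file bytes go to S3 via boto3, while the OpenAI-style metadata row lives in SQLite. The class name, table schema, and method names are illustrative assumptions, not the actual `S3FilesImpl` API:

```python
# Illustrative sketch only; names and schema are assumptions, not the provider's real API.
import sqlite3
import time
import uuid

import boto3


class S3FilesSketch:
    def __init__(self, bucket: str, endpoint_url: str | None = None, db_path: str = "files_metadata.db"):
        # endpoint_url lets the same code target AWS, MinIO, or LocalStack.
        self.bucket = bucket
        self.client = boto3.client("s3", endpoint_url=endpoint_url)
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS files "
            "(id TEXT PRIMARY KEY, filename TEXT, bytes INTEGER, purpose TEXT, created_at INTEGER)"
        )

    def upload(self, filename: str, data: bytes, purpose: str = "assistants") -> dict:
        # Object body goes to S3; the OpenAI-style metadata row goes to SQLite.
        file_id = f"file-{uuid.uuid4().hex}"
        self.client.put_object(Bucket=self.bucket, Key=file_id, Body=data)
        row = (file_id, filename, len(data), purpose, int(time.time()))
        self.db.execute("INSERT INTO files VALUES (?, ?, ?, ?, ?)", row)
        self.db.commit()
        return {"id": file_id, "filename": filename, "bytes": len(data), "purpose": purpose}

    def download(self, file_id: str) -> bytes:
        # Metadata is not needed for retrieval here; the object key is the file id.
        return self.client.get_object(Bucket=self.bucket, Key=file_id)["Body"].read()
```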

Key Features:
- Automatic S3 bucket creation and management
- Metadata persistence
- Proper error handling for S3 connectivity and permission failures (sketched below)
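
As a hedged illustration of what automatic bucket creation with error handling can look like in boto3 (the provider's actual logic, configuration flags, and error messages may differ):

```python
# Sketch of automatic bucket creation; the real provider's behavior may differ.
import boto3
from botocore.exceptions import ClientError


def ensure_bucket(client, bucket: str) -> None:
    """Create the bucket if it does not exist; surface permission problems clearly."""
    try:
        client.head_bucket(Bucket=bucket)  # cheap existence/permission probe
    except ClientError as e:
        code = e.response["Error"]["Code"]
        if code in ("404", "NoSuchBucket"):
            # Note: outside us-east-1, create_bucket also needs CreateBucketConfiguration.
            client.create_bucket(Bucket=bucket)
        elif code in ("403", "AccessDenied"):
            raise RuntimeError(f"No permission to access S3 bucket '{bucket}'") from e
        else:
            raise
```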

Dependencies:
- Adds boto3 for AWS S3 integration
- Adds moto[s3] to mock S3 in the test suite (usage sketched below)
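
For context, `moto[s3]` lets unit tests exercise boto3 against an in-memory S3 instead of a real endpoint. A minimal usage sketch, not an actual test from this repository (assumes moto >= 5, which exposes `mock_aws`):

```python
# Minimal moto usage sketch; not one of this repository's tests.
import boto3
from moto import mock_aws  # moto >= 5; older releases used mock_s3


@mock_aws
def test_put_and_get_roundtrip():
    client = boto3.client("s3", region_name="us-east-1")
    client.create_bucket(Bucket="llama-stack-files")
    client.put_object(Bucket="llama-stack-files", Key="file-123", Body=b"hello")
    body = client.get_object(Bucket="llama-stack-files", Key="file-123")["Body"].read()
    assert body == b"hello"
```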

Testing:

 Unit: `./scripts/unit-tests.sh tests/unit/files tests/unit/providers/files`

 Integration:

  1. Start MinIO: `podman run --rm -it -p 9000:9000 minio/minio server /data`

  2. Start the stack with the S3 provider: `S3_ENDPOINT_URL=http://localhost:9000 AWS_ACCESS_KEY_ID=minioadmin AWS_SECRET_ACCESS_KEY=minioadmin S3_BUCKET_NAME=llama-stack-files uv run llama stack build --image-type venv --providers files=remote::s3 --run`

  3. Run the integration tests: `./scripts/integration-tests.sh --stack-config http://localhost:8321 --provider ollama --test-subdirs files` (a manual upload check is sketched below)
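
A quick manual upload check against the running stack, in addition to the scripted tests. This assumes the server exposes OpenAI-compatible Files routes at the base URL shown; adjust the prefix and credentials for your deployment:

```python
# Manual smoke test against a locally running stack.
# The base_url prefix below is an assumption -- point it at the server's
# OpenAI-compatible API root for your deployment.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1/openai/v1", api_key="none")

with open("note.txt", "w") as fh:
    fh.write("hello from the S3 files provider\n")

with open("note.txt", "rb") as fh:
    uploaded = client.files.create(file=fh, purpose="assistants")

print(uploaded.id, uploaded.filename, uploaded.bytes)  # returned file metadata
print([f.id for f in client.files.list().data])        # listing should include the new id
print(client.files.content(uploaded.id).read())        # bytes come back from S3/MinIO
```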
| Name | Last commit | Date |
| --- | --- | --- |
| `inline` | chore(files tests): update files integration tests and fix inline::localfs (#3195) | 2025-08-20 14:22:40 -04:00 |
| `registry` | feat: Add S3 Files Provider implementation | 2025-08-20 14:23:57 -04:00 |
| `remote` | feat: Add S3 Files Provider implementation | 2025-08-20 14:23:57 -04:00 |
| `utils` | chore(pre-commit): add pre-commit hook to enforce llama_stack logger usage (#3061) | 2025-08-20 07:15:35 -04:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | feat: create unregister shield API endpoint in Llama Stack (#2853) | 2025-08-05 07:33:46 -07:00 |