# llama-stack-api

API and Provider specifications for Llama Stack - a lightweight package with protocol definitions and provider specs.

## Overview

`llama-stack-api` is a minimal-dependency package that contains:

- **API Protocol Definitions**: Type-safe protocol definitions for all Llama Stack APIs (inference, agents, safety, etc.)
- **Provider Specifications**: Provider spec definitions for building custom providers
- **Data Types**: Shared data types and models used across the Llama Stack ecosystem
- **Type Utilities**: Strong typing utilities and schema validation

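To make the last two bullets concrete, the sketch below derives a JSON schema from one of the shared request types. It assumes only that `ChatCompletionRequest` (imported the same way as in the usage example further down) is a Pydantic model, which the Pydantic dependency suggests; treat it as an illustration rather than a documented API.

```python
# A minimal sketch of the "Data Types" and "Type Utilities" bullets above.
# Assumption: ChatCompletionRequest is a Pydantic v2 model (Pydantic is the
# package's main dependency); its exact fields are not shown here.
import json

from llama_stack_api.inference import ChatCompletionRequest

# Pydantic v2 models expose model_json_schema(); the resulting schema can be
# validated with the `jsonschema` utilities or used to generate documentation.
schema = ChatCompletionRequest.model_json_schema()
print(json.dumps(schema, indent=2))
```
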
## What This Package Does NOT Include

- Server implementation (see `llama-stack` package)
- Provider implementations (see `llama-stack` package)
- CLI tools (see `llama-stack` package)
- Runtime orchestration (see `llama-stack` package)

## Use Cases

This package is designed for:

1. **Third-party Provider Developers**: Build custom providers without depending on the full Llama Stack server
2. **Client Library Authors**: Use type definitions without server dependencies (see the sketch after this list)
3. **Documentation Generation**: Generate API docs from protocol definitions
4. **Type Checking**: Validate implementations against the official specs

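As a sketch of the "Client Library Authors" and "Type Checking" use cases, the hypothetical helper below is typed purely against the protocol definitions; `Inference` and `ChatCompletionRequest` are the same names imported in the usage example further down, and no server code is involved.

```python
# A sketch for the "Client Library Authors" and "Type Checking" use cases.
# Inference and ChatCompletionRequest mirror the usage example below; the
# relay() helper itself is hypothetical.
from llama_stack_api.inference import ChatCompletionRequest, Inference


async def relay(provider: Inference, request: ChatCompletionRequest):
    # Any object passed here must satisfy the Inference protocol; a static
    # type checker such as mypy will reject providers that do not match,
    # without the llama-stack server ever being imported.
    return await provider.chat_completion(request)
```
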
## Installation

```bash
pip install llama-stack-api
```

Or with uv:

```bash
uv pip install llama-stack-api
```

## Dependencies

Minimal dependencies:

- `pydantic>=2.11.9` - For data validation and serialization
- `jsonschema` - For JSON schema utilities

## Versioning

This package follows semantic versioning independently from the main `llama-stack` package:

- **Patch versions** (0.1.x): Documentation, internal improvements
- **Minor versions** (0.x.0): New APIs, backward-compatible changes
- **Major versions** (x.0.0): Breaking changes to existing APIs

Current version: **0.4.0.dev0**

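A downstream package that relies on APIs added in a particular minor release might check for a minimum version at import time. The sketch below uses only the standard library and is illustrative, not part of this package.

```python
# A minimal sketch of applying the versioning policy above: minor releases may
# add APIs, so a provider built against the 0.4 series could verify it is not
# running against an older release.
from importlib.metadata import version

api_version = version("llama-stack-api")  # e.g. "0.4.0.dev0"
major, minor = (int(part) for part in api_version.split(".")[:2])
if (major, minor) < (0, 4):
    raise RuntimeError(f"llama-stack-api >= 0.4 required, found {api_version}")
```
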
## Usage Example

```python
from llama_stack_api.inference import Inference, ChatCompletionRequest
from llama_stack_api.providers.datatypes import ProviderSpec, InlineProviderSpec
from llama_stack_api.datatypes import Api


# Use protocol definitions for type checking
class MyInferenceProvider(Inference):
    async def chat_completion(self, request: ChatCompletionRequest):
        # Your implementation
        pass


# Define provider specifications
my_provider_spec = InlineProviderSpec(
    api=Api.inference,
    provider_type="inline::my-provider",
    pip_packages=["my-dependencies"],
    module="my_package.providers.inference",
    config_class="my_package.providers.inference.MyConfig",
)
```

## Relationship to llama-stack

The main `llama-stack` package depends on `llama-stack-api` and provides:

- Full server implementation
- Built-in provider implementations
- CLI tools for running and managing stacks
- Runtime provider resolution and orchestration

## Contributing

See the main [Llama Stack repository](https://github.com/llamastack/llama-stack) for contribution guidelines.

## License

MIT License - see LICENSE file for details.

## Links

- [Main Llama Stack Repository](https://github.com/llamastack/llama-stack)
- [Documentation](https://llamastack.ai/)
- [Client Library](https://pypi.org/project/llama-stack-client/)