Enforce that all imports from llama-stack-api use the form:
from llama_stack_api import <symbol>
This prevents external code from accessing internal package structure
(e.g., llama_stack_api.agents, llama_stack_api.common.*) and establishes
a clear public API boundary.
Changes:
- Export 400+ symbols from llama_stack_api/__init__.py
- Include all API types, common utilities, and strong_typing helpers
- Update files across src/llama_stack, docs/, tests/, scripts/
- Convert all submodule imports to top-level imports
- Ensure docs use the proper import structure
Addresses PR review feedback requiring explicit __all__ definition to
prevent "peeking inside" the API package.
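As a minimal sketch of the pattern being enforced (the package name "demo_api" and the symbol "Agent" are placeholders for illustration, not the real llama_stack_api contents), a package root re-exports its public surface and declares it with an explicit __all__, so consumers never reach into submodules:

```python
import sys
import types

# Build a tiny in-memory package to illustrate the re-export pattern.
# "demo_api" and "Agent" are placeholder names, not the real package.
pkg = types.ModuleType("demo_api")
sub = types.ModuleType("demo_api.agents")

class Agent:  # stands in for an API type defined in a submodule
    pass

sub.Agent = Agent
pkg.Agent = Agent          # re-exported at the package root
pkg.__all__ = ["Agent"]    # explicit public surface
pkg.agents = sub

sys.modules["demo_api"] = pkg
sys.modules["demo_api.agents"] = sub

# Consumers use only the top-level form:
from demo_api import Agent as PublicAgent

assert PublicAgent is Agent
# Internal paths like demo_api.agents still exist, but they are not
# part of the public API, so a lint rule can reject imports from them.
```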
Signed-off-by: Charlie Doern <cdoern@redhat.com>
Move llama_stack_api.apis... to the top-level llama_stack_api namespace.
Merge the provider datatypes and the existing apis.datatypes into a common llama_stack_api.datatypes module.
Update all usages of these packages throughout LLS.
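An illustrative sketch of what the merge means for import sites (the module path "demo_stack_api" and the symbols "ProviderSpec" and "Api" are assumed names, not taken from the real tree): after the merge, both families of datatypes are importable from one module.

```python
import sys
import types

# Simulate the merged datatypes module in memory; names are placeholders.
merged = types.ModuleType("demo_stack_api.datatypes")

class ProviderSpec:  # previously lived with the provider datatypes
    pass

class Api:           # previously lived in apis.datatypes
    pass

merged.ProviderSpec = ProviderSpec
merged.Api = Api

sys.modules["demo_stack_api"] = types.ModuleType("demo_stack_api")
sys.modules["demo_stack_api.datatypes"] = merged

# One import site now covers both kinds of datatypes:
from demo_stack_api.datatypes import ProviderSpec, Api
```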
Signed-off-by: Charlie Doern <cdoern@redhat.com>
Extract API definitions, models, and provider specifications into a
standalone llama-stack-api package that can be published to PyPI
independently of the main llama-stack server.
Motivation
External providers currently import from llama-stack, which overrides
the installed version and causes dependency conflicts. This separation
allows external providers to:
- Install only the type definitions they need without server dependencies
- Avoid version conflicts with the installed llama-stack package
- Be versioned and released independently
This enables us to re-enable external provider module tests that were
previously blocked by these import conflicts.
Changes
- Created llama-stack-api package with minimal dependencies (pydantic, jsonschema)
- Moved APIs, providers datatypes, strong_typing, and schema_utils
- Updated all imports from llama_stack.* to llama_stack_api.*
- Preserved git history using git mv for moved files
- Configured local editable install for development workflow
- Updated linting and type-checking configuration for both packages
- Rebased on top of upstream src/ layout changes
Testing
Package builds successfully and can be imported independently.
All pre-commit hooks pass with expected exclusions maintained.
Next Steps
- Publish llama-stack-api to PyPI
- Update external provider dependencies
- Re-enable external provider module tests
Signed-off-by: Charlie Doern <cdoern@redhat.com>
The llama-stack-client now uses `/v1/openai/v1/models`, which returns
OpenAI-compatible model objects with 'id' and 'custom_metadata' fields
instead of the Resource-style 'identifier' field. Updated api_recorder
to handle the new endpoint and modified tests to access model metadata
appropriately. Deleted stale model recordings for re-recording.
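A rough sketch of the shape change (the field names come from the message above; the payload values and the helper function are invented for illustration):

```python
# Old Resource-style object vs. new OpenAI-compatible object.
# Values are made up; only the field names reflect the change.
old_model = {"identifier": "llama-3-8b", "provider_id": "meta"}
new_model = {
    "id": "llama-3-8b",
    "custom_metadata": {"provider_id": "meta"},
}

def model_id(model: dict) -> str:
    # Hypothetical accessor: read the OpenAI-style 'id' field, falling
    # back to the legacy Resource-style 'identifier' for old payloads.
    return model.get("id") or model["identifier"]

assert model_id(new_model) == model_id(old_model) == "llama-3-8b"
assert new_model["custom_metadata"]["provider_id"] == "meta"
```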
**NOTE: CI will be red on this one since it is dependent on
https://github.com/llamastack/llama-stack-client-python/pull/291/files
landing. I verified locally that it is green.**