feat: split API and provider specs into separate llama-stack-api pkg

Extract API definitions, models, and provider specifications into a
standalone llama-stack-api package that can be published to PyPI
independently of the main llama-stack server.

Motivation

External providers currently import from llama-stack, which overrides
the installed version and causes dependency conflicts. This separation
allows external providers to:

- Install only the type definitions they need without server dependencies
- Avoid version conflicts with the installed llama-stack package
- Be versioned and released independently

This makes it possible to re-enable the external provider module tests
that were previously blocked by these import conflicts.

Changes

- Created llama-stack-api package with minimal dependencies (pydantic, jsonschema)
- Moved APIs, providers datatypes, strong_typing, and schema_utils
- Updated all imports from llama_stack.* to llama_stack_api.*
- Preserved git history using git mv for moved files
- Configured local editable install for development workflow
- Updated linting and type-checking configuration for both packages
- Rebased on top of upstream src/ layout changes
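The new package's metadata might look roughly like the sketch below. The package name and the minimal dependency set (pydantic, jsonschema) come from the change list above; the version, description, and build backend are assumptions for illustration only.

```toml
# Hypothetical pyproject.toml for the extracted llama-stack-api package.
# Only the name and the dependency set are taken from the commit text;
# everything else is illustrative.
[project]
name = "llama-stack-api"
version = "0.1.0"
description = "API definitions, models, and provider specifications for Llama Stack"
requires-python = ">=3.10"
dependencies = [
    "pydantic",
    "jsonschema",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```

During development, the main llama-stack package can consume this as a local editable install (for example, `pip install -e ./llama-stack-api`), matching the "local editable install" bullet above; the directory path is an assumption.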

Testing

Package builds successfully and can be imported independently.
All pre-commit hooks pass with expected exclusions maintained.
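A sanity check along these lines could back the build-and-import claim; the package directory name is an assumption, and these commands require the repository checkout.

```shell
# Hypothetical verification steps; the llama-stack-api directory name
# is assumed from the commit description.
python -m build llama-stack-api/                # build sdist and wheel
pip install llama-stack-api/dist/*.whl          # install into a clean env
python -c "import llama_stack_api"              # import without the server package
```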

Next Steps

- Publish llama-stack-api to PyPI
- Update external provider dependencies
- Re-enable external provider module tests

Signed-off-by: Charlie Doern <cdoern@redhat.com>
Author: Charlie Doern 2025-10-30 12:25:23 -04:00
Parent: e5a55f3677
Commit: 85d407c2a0
359 changed files with 1259 additions and 980 deletions

@@ -12,27 +12,28 @@ import tempfile
 from typing import Any
 import yaml
-from llama_stack.apis.agents import Agents
-from llama_stack.apis.batches import Batches
-from llama_stack.apis.benchmarks import Benchmarks
-from llama_stack.apis.conversations import Conversations
-from llama_stack.apis.datasetio import DatasetIO
-from llama_stack.apis.datasets import Datasets
-from llama_stack.apis.eval import Eval
-from llama_stack.apis.files import Files
-from llama_stack.apis.inference import Inference
-from llama_stack.apis.inspect import Inspect
-from llama_stack.apis.models import Models
-from llama_stack.apis.post_training import PostTraining
-from llama_stack.apis.prompts import Prompts
-from llama_stack.apis.providers import Providers
-from llama_stack.apis.safety import Safety
-from llama_stack.apis.scoring import Scoring
-from llama_stack.apis.scoring_functions import ScoringFunctions
-from llama_stack.apis.shields import Shields
-from llama_stack.apis.tools import ToolGroups, ToolRuntime
-from llama_stack.apis.vector_io import VectorIO
+from llama_stack_api.apis.agents import Agents
+from llama_stack_api.apis.batches import Batches
+from llama_stack_api.apis.benchmarks import Benchmarks
+from llama_stack_api.apis.conversations import Conversations
+from llama_stack_api.apis.datasetio import DatasetIO
+from llama_stack_api.apis.datasets import Datasets
+from llama_stack_api.apis.eval import Eval
+from llama_stack_api.apis.files import Files
+from llama_stack_api.apis.inference import Inference
+from llama_stack_api.apis.inspect import Inspect
+from llama_stack_api.apis.models import Models
+from llama_stack_api.apis.post_training import PostTraining
+from llama_stack_api.apis.prompts import Prompts
+from llama_stack_api.apis.providers import Providers
+from llama_stack_api.apis.safety import Safety
+from llama_stack_api.apis.scoring import Scoring
+from llama_stack_api.apis.scoring_functions import ScoringFunctions
+from llama_stack_api.apis.shields import Shields
+from llama_stack_api.apis.tools import ToolGroups, ToolRuntime
+from llama_stack_api.apis.vector_io import VectorIO
+from llama_stack_api.providers.datatypes import Api
 from llama_stack.core.conversations.conversations import ConversationServiceConfig, ConversationServiceImpl
 from llama_stack.core.datatypes import Provider, SafetyConfig, StackRunConfig, VectorStoresConfig
 from llama_stack.core.distribution import get_provider_registry
@@ -54,7 +55,6 @@ from llama_stack.core.storage.datatypes import (
 from llama_stack.core.store.registry import create_dist_registry
 from llama_stack.core.utils.dynamic import instantiate_class_type
 from llama_stack.log import get_logger
-from llama_stack.providers.datatypes import Api
 logger = get_logger(name=__name__, category="core")