chore: use RemoteInferenceProviderConfig for remote inference providers (#3668)

# What does this PR do?

On the path to maintainable implementations of inference providers: make every remote inference provider config a subclass of RemoteInferenceProviderConfig.

## Test Plan

CI
Matthew Farrellee, 2025-10-03 11:48:42 -04:00, committed by GitHub
parent a20e8eac8c
commit ce77c27ff8
37 changed files with 65 additions and 26 deletions


```diff
@@ -8,6 +8,7 @@ from typing import Any
 from pydantic import BaseModel, Field
+from llama_stack.providers.utils.inference.model_registry import RemoteInferenceProviderConfig
 from llama_stack.schema_utils import json_schema_type
@@ -19,7 +20,7 @@ class OpenAIProviderDataValidator(BaseModel):
 @json_schema_type
-class OpenAIConfig(BaseModel):
+class OpenAIConfig(RemoteInferenceProviderConfig):
     api_key: str | None = Field(
         default=None,
         description="API key for OpenAI models",
```