chore: use remoteinferenceproviderconfig for remote inference providers (#3668)

# What does this PR do?

On the path to maintainable implementations of inference providers: make all remote inference provider configs instances of `RemoteInferenceProviderConfig`.

## Test Plan

CI
Matthew Farrellee 2025-10-03 11:48:42 -04:00 committed by GitHub
parent a20e8eac8c
commit ce77c27ff8
37 changed files with 65 additions and 26 deletions


```diff
@@ -6,12 +6,14 @@
 from typing import Any

-from pydantic import BaseModel, Field
+from pydantic import Field
+
+from llama_stack.providers.utils.inference.model_registry import RemoteInferenceProviderConfig

 DEFAULT_OLLAMA_URL = "http://localhost:11434"

-class OllamaImplConfig(BaseModel):
+class OllamaImplConfig(RemoteInferenceProviderConfig):
     url: str = DEFAULT_OLLAMA_URL
     refresh_models: bool = Field(
         default=False,
```