chore: use RemoteInferenceProviderConfig for remote inference providers (#3668)

# What does this PR do?

On the path to maintainable implementations of inference providers: make all remote inference provider configs subclasses of RemoteInferenceProviderConfig.

## Test Plan

CI
Matthew Farrellee, 2025-10-03 11:48:42 -04:00, committed by GitHub
parent a20e8eac8c · commit ce77c27ff8
37 changed files with 65 additions and 26 deletions


```diff
@@ -7,13 +7,14 @@
 import os
 from typing import Any
-from pydantic import BaseModel, Field, SecretStr
+from pydantic import Field, SecretStr
+from llama_stack.providers.utils.inference.model_registry import RemoteInferenceProviderConfig
 from llama_stack.schema_utils import json_schema_type
 @json_schema_type
-class NVIDIAConfig(BaseModel):
+class NVIDIAConfig(RemoteInferenceProviderConfig):
     """
     Configuration for the NVIDIA NIM inference endpoint.
```