llama-stack-mirror/llama_stack/providers/remote/inference/openai
Latest commit ce77c27ff8 by Matthew Farrellee
chore: use remoteinferenceproviderconfig for remote inference providers (#3668)
# What does this PR do?

On the path to maintainable implementations of inference providers: make all remote inference provider configs instances of RemoteInferenceProviderConfig.
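A minimal sketch of what this pattern looks like for the OpenAI provider config, assuming RemoteInferenceProviderConfig is the shared base class for remote inference provider configs; the import path and field names below are illustrative, not the repository's verbatim code:

```python
# Illustrative sketch only: the import path and the specific fields are assumptions,
# not the repository's exact code.
from pydantic import Field, SecretStr

# Assumed location of the shared base config for remote inference providers.
from llama_stack.providers.utils.inference.model_registry import RemoteInferenceProviderConfig


class OpenAIConfig(RemoteInferenceProviderConfig):
    """Config for the remote OpenAI inference provider, built on the shared base class."""

    # Provider-specific settings layer on top of whatever the base class defines,
    # so every remote provider config shares a uniform shape.
    api_key: SecretStr | None = Field(
        default=None,
        description="API key used to authenticate against the OpenAI API.",
    )
    base_url: str = Field(
        default="https://api.openai.com/v1",
        description="Base URL for the OpenAI-compatible endpoint.",
    )
```

The intended benefit is maintainability: behavior common to all remote providers lives once in the base class instead of being reimplemented per provider.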

## Test Plan

CI
2025-10-03 08:48:42 -07:00
| File | Last commit | Date |
|------|-------------|------|
| __init__.py | chore: remove duplicate OpenAI and Gemini data validators (#3513) | 2025-09-22 13:53:17 +02:00 |
| config.py | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| openai.py | chore(api): remove batch inference (#3261) | 2025-09-26 14:35:34 -07:00 |