llama-stack-mirror/llama_stack/providers/remote/inference
Latest commit: 999c28e809 "fix: Update Watsonx provider to use LiteLLM mixin and list all models" by Bill Murdock, 2025-10-03 15:10:23 -04:00
Signed-off-by: Bill Murdock <bmurdock@redhat.com>
anthropic chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
azure chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
bedrock chore: remove deprecated inference.chat_completion implementations (#3654) 2025-10-03 07:55:34 -04:00
cerebras chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
databricks chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
fireworks chore: remove deprecated inference.chat_completion implementations (#3654) 2025-10-03 07:55:34 -04:00
gemini chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
groq chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
llama_openai_compat chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
nvidia chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
ollama chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
openai chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
passthrough chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
runpod chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
sambanova chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
tgi chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
together chore: remove deprecated inference.chat_completion implementations (#3654) 2025-10-03 07:55:34 -04:00
vertexai chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
vllm chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
watsonx fix: Update Watsonx provider to use LiteLLM mixin and list all models 2025-10-03 15:10:23 -04:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00
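Each subdirectory above is a remote inference adapter that a distribution selects through its run configuration, typically by a provider type of the form remote::<name> matching the directory name. As a rough illustration only (a minimal sketch, not taken from this listing; the provider_id and the config field names below are assumptions and differ per adapter), wiring in one of these remote providers might look like:

```yaml
# Hypothetical run.yaml fragment showing how one of the remote inference
# adapters listed above might be selected. The provider_id and the config
# fields (url, api_token) are illustrative assumptions, not the exact schema.
providers:
  inference:
    - provider_id: vllm-remote
      provider_type: remote::vllm        # corresponds to the vllm/ directory above
      config:
        url: http://localhost:8000/v1    # base URL of the remote vLLM server (assumed field)
        api_token: ${env.VLLM_API_TOKEN} # assumed env-substitution pattern
```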