| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| anthropic | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| azure | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| bedrock | chore: remove deprecated inference.chat_completion implementations (#3654) | 2025-10-03 07:55:34 -04:00 |
| cerebras | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| databricks | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| fireworks | chore: remove deprecated inference.chat_completion implementations (#3654) | 2025-10-03 07:55:34 -04:00 |
| gemini | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| groq | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| llama_openai_compat | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| nvidia | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| ollama | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| openai | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| passthrough | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| runpod | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| sambanova | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| tgi | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| together | chore: remove deprecated inference.chat_completion implementations (#3654) | 2025-10-03 07:55:34 -04:00 |
| vertexai | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| vllm | feat: implement graceful model discovery for vLLM provider | 2025-10-03 21:32:15 +02:00 |
| watsonx | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| `__init__.py` | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |