llama-stack-mirror/llama_stack/providers/remote/inference
| Name | Latest commit | Date |
|---|---|---|
| anthropic | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| bedrock | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| cerebras | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| cerebras_openai_compat | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| databricks | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| fireworks | fix: fireworks provider for openai compat inference endpoint (#2335) | 2025-06-02 14:11:15 -07:00 |
| fireworks_openai_compat | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| gemini | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| groq | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| groq_openai_compat | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| llama_openai_compat | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| nvidia | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| ollama | refactor: unify stream and non-stream impls for responses (#2388) | 2025-06-05 17:48:09 +02:00 |
| openai | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| passthrough | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| runpod | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| sambanova | fix(providers): update sambanova json schema mode (#2306) | 2025-05-29 09:54:23 -07:00 |
| sambanova_openai_compat | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| tgi | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| together | feat: New OpenAI compat embeddings API (#2314) | 2025-05-31 22:11:47 -07:00 |
| together_openai_compat | feat: introduce APIs for retrieving chat completion requests (#2145) | 2025-05-18 21:43:19 -07:00 |
| vllm | feat: To add health status check for remote VLLM (#2303) | 2025-06-06 15:33:12 -04:00 |
| watsonx | fix var name | 2025-06-13 14:40:55 +05:30 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |