llama-stack-mirror/llama_stack/providers/remote/inference
Ben Browning fcdeb3d7bf OpenAI completion prompt can also include tokens
The OpenAI completion API accepts a string, an array of strings, an array
of tokens, or an array of token arrays, so expand our type hints to
support all of these forms.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
2025-04-09 15:47:02 -04:00
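A minimal sketch of the widened prompt typing this commit describes; the alias name, function name, and signature below are illustrative assumptions, not the exact definitions used in llama-stack:

```python
# Sketch only: names and signature are assumptions for illustration.
from typing import List, Union

# The OpenAI completions `prompt` field may be a string, a list of
# strings, a list of token IDs, or a list of token-ID lists.
OpenAICompletionPrompt = Union[str, List[str], List[int], List[List[int]]]


async def openai_completion(
    model: str,
    prompt: OpenAICompletionPrompt,
    max_tokens: int = 16,
) -> dict:
    """Illustrative stub showing the expanded prompt type hint."""
    ...
```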
anthropic feat(providers): Groq now uses LiteLLM openai-compat (#1303) 2025-02-27 13:16:50 -08:00
bedrock Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
cerebras Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
cerebras_openai_compat test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
databricks Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
fireworks test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
fireworks_openai_compat test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
gemini feat(providers): Groq now uses LiteLLM openai-compat (#1303) 2025-02-27 13:16:50 -08:00
groq test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
groq_openai_compat test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
nvidia Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
ollama OpenAI completion prompt can also include tokens 2025-04-09 15:47:02 -04:00
openai feat(providers): Groq now uses LiteLLM openai-compat (#1303) 2025-02-27 13:16:50 -08:00
passthrough OpenAI completion prompt can also include tokens 2025-04-09 15:47:02 -04:00
runpod Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
sambanova Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
sambanova_openai_compat test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
tgi Add unsupported OpenAI mixin to all remaining inference providers 2025-04-09 15:47:02 -04:00
together OpenAI completion prompt can also include tokens 2025-04-09 15:47:02 -04:00
together_openai_compat test: verification on provider's OAI endpoints (#1893) 2025-04-07 23:06:28 -07:00
vllm OpenAI completion prompt can also include tokens 2025-04-09 15:47:02 -04:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00