Directory: llama_stack/providers/utils/inference (llama-stack-mirror)
Latest commit: 00c4493bda "OpenAI-compatible completions and chats for litellm and together" (Ben Browning, 2025-04-09 15:47:02 -04:00)

This adds OpenAI-compatible completions and chat completions support for the native Together provider as well as all providers implemented with litellm.
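As a rough illustration of what "OpenAI-compatible chat completions" means for a client, the sketch below uses the openai Python client pointed at a locally running Llama Stack server. This is not taken from the repository itself: the base URL, port, API key, and model identifier are all assumptions and would need to match your own deployment.

# Minimal sketch: calling an OpenAI-compatible chat completions endpoint.
# Assumptions (not from this repo): a Llama Stack server is running locally,
# it exposes an OpenAI-compatible route at the URL below, and the model id exists.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8321/v1/openai/v1",  # assumed endpoint path
    api_key="none",  # placeholder; a local server may ignore the key
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # assumed model identifier
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)

Because the route follows the OpenAI wire format, the same call works unchanged whether the request is served by the native Together provider or by a litellm-backed provider; only the registered model identifier differs.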
File                      Last commit                                                                    Date
__init__.py               refactor: move all llama code to models/llama out of meta reference (#1887)   2025-04-07 15:03:58 -07:00
embedding_mixin.py        fix: dont assume SentenceTransformer is imported                               2025-02-25 16:53:01 -08:00
litellm_openai_mixin.py   OpenAI-compatible completions and chats for litellm and together              2025-04-09 15:47:02 -04:00
model_registry.py         test: verification on provider's OAI endpoints (#1893)                        2025-04-07 23:06:28 -07:00
openai_compat.py          OpenAI-compatible completions and chats for litellm and together              2025-04-09 15:47:02 -04:00
prompt_adapter.py         refactor: move all llama code to models/llama out of meta reference (#1887)   2025-04-07 15:03:58 -07:00