llama-stack/llama_stack/providers/utils/inference
Latest commit: 7b4eb0967e by ehhuang
test: verification on provider's OAI endpoints (#1893)
# What does this PR do?
Adds verification tests against providers' OpenAI-compatible (OAI) endpoints.

## Test Plan
export MODEL=accounts/fireworks/models/llama4-scout-instruct-basic;
LLAMA_STACK_CONFIG=verification pytest -s -v tests/integration/inference --vision-model $MODEL --text-model $MODEL
2025-04-07 23:06:28 -07:00
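
The Test Plan above runs the integration inference suite against a provider's OpenAI-compatible endpoint. As a rough illustration of the call pattern being verified (a sketch, not code from this directory; the base URL, environment variable names, and default model are assumptions), the same kind of endpoint can be exercised directly with the standard OpenAI client:

```python
# Minimal sketch: call a provider's OpenAI-compatible chat endpoint directly.
# OPENAI_BASE_URL / OPENAI_API_KEY and the Fireworks URL below are illustrative
# placeholders, not values taken from this repository.
import os

from openai import OpenAI

client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "https://api.fireworks.ai/inference/v1"),
    api_key=os.environ["OPENAI_API_KEY"],
)

response = client.chat.completions.create(
    model=os.environ.get("MODEL", "accounts/fireworks/models/llama4-scout-instruct-basic"),
    messages=[{"role": "user", "content": "Reply with a single word."}],
)
print(response.choices[0].message.content)
```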
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | refactor: move all llama code to models/llama out of meta reference (#1887) | 2025-04-07 15:03:58 -07:00 |
| embedding_mixin.py | fix: dont assume SentenceTransformer is imported | 2025-02-25 16:53:01 -08:00 |
| litellm_openai_mixin.py | feat: introduce llama4 support (#1877) | 2025-04-05 11:53:35 -07:00 |
| model_registry.py | test: verification on provider's OAI endpoints (#1893) | 2025-04-07 23:06:28 -07:00 |
| openai_compat.py | test: verification on provider's OAI endpoints (#1893) | 2025-04-07 23:06:28 -07:00 |
| prompt_adapter.py | refactor: move all llama code to models/llama out of meta reference (#1887) | 2025-04-07 15:03:58 -07:00 |