llama-stack-mirror/llama_stack/providers/remote/inference/llama_openai_compat
Matthew Farrellee de9940c697
chore: disable openai_embeddings on inference=remote::llama-openai-compat (#3704)
# What does this PR do?

api.llama.com does not provide embedding models; this PR makes that explicit.


## Test Plan

CI
2025-10-06 13:27:40 -04:00
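The commit above disables the embeddings capability for this provider because the upstream API serves no embedding models. A minimal sketch of that idea, with hypothetical class and method names (the real change lives in `llama.py` and builds on `OpenAIMixin`, which this sketch does not reproduce):

```python
# Illustrative sketch only: a provider adapter that opts out of embeddings
# because the upstream API (api.llama.com) offers no embedding models.
# Class name and signature are assumptions, not the actual llama_stack code.
from typing import Any


class LlamaOpenAICompatAdapter:
    """Hypothetical stand-in for the remote::llama-openai-compat provider."""

    async def openai_embeddings(self, *args: Any, **kwargs: Any):
        # Fail fast with a clear message instead of surfacing an opaque
        # upstream error when a caller requests embeddings.
        raise NotImplementedError(
            "api.llama.com does not provide embedding models"
        )
```

Callers hitting the embeddings endpoint on this provider would then get an immediate, descriptive `NotImplementedError` rather than a confusing failure from the remote service.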
__init__.py chore: turn OpenAIMixin into a pydantic.BaseModel (#3671) 2025-10-06 11:33:19 -04:00
config.py chore: use remoteinferenceproviderconfig for remote inference providers (#3668) 2025-10-03 08:48:42 -07:00
llama.py chore: disable openai_embeddings on inference=remote::llama-openai-compat (#3704) 2025-10-06 13:27:40 -04:00