llama-stack-mirror/llama_stack/providers
Matthew Farrellee de9940c697
chore: disable openai_embeddings on inference=remote::llama-openai-compat (#3704)
# What does this PR do?

api.llama.com does not provide embedding models; this change makes that explicit in the provider.
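
A minimal sketch of the idea, not the actual code from PR #3704: an OpenAI-compatible adapter can override the embeddings entry point to fail fast with a clear message instead of forwarding a request the backend cannot serve. The class and method names below are illustrative assumptions only.

```python
# Hypothetical sketch (names are assumptions, not the real adapter from #3704):
# an inference adapter for an OpenAI-compatible Llama API that rejects
# embedding requests because api.llama.com hosts no embedding models.


class LlamaOpenAICompatAdapter:
    """Illustrative inference adapter for an OpenAI-compatible Llama endpoint."""

    async def openai_embeddings(self, model: str, input: list[str], **kwargs):
        # Fail fast with an explicit error rather than proxying a request the
        # upstream service cannot fulfill.
        raise NotImplementedError(
            "api.llama.com does not provide embedding models"
        )
```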


## Test Plan

CI
2025-10-06 13:27:40 -04:00
| Name | Last commit | Date |
|------|-------------|------|
| inline | feat(api): add extra_body parameter support with shields example (#3670) | 2025-10-03 13:25:09 -07:00 |
| registry | chore: turn OpenAIMixin into a pydantic.BaseModel (#3671) | 2025-10-06 11:33:19 -04:00 |
| remote | chore: disable openai_embeddings on inference=remote::llama-openai-compat (#3704) | 2025-10-06 13:27:40 -04:00 |
| utils | chore: turn OpenAIMixin into a pydantic.BaseModel (#3671) | 2025-10-06 11:33:19 -04:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | feat: combine ProviderSpec datatypes (#3378) | 2025-09-18 16:10:00 +02:00 |