llama-stack/llama_stack/providers/remote/inference/openai
ehhuang 8feb1827c8
fix: openai provider model id (#2229)
# What does this PR do?
Since https://github.com/meta-llama/llama-stack/pull/2193 switched to the
OpenAI SDK, we need to strip the `openai/` prefix from the `model_id`.
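
A minimal sketch of the prefix stripping described above, assuming provider-qualified ids like `openai/gpt-4o` (the helper name `strip_openai_prefix` is hypothetical, not the actual function in this PR):

```python
def strip_openai_prefix(model_id: str) -> str:
    """Return model_id without a leading 'openai/' provider prefix.

    e.g. 'openai/gpt-4o' -> 'gpt-4o'; ids without the prefix pass through.
    """
    # str.removeprefix (Python 3.9+) is a no-op when the prefix is absent
    return model_id.removeprefix("openai/")
```

This keeps the change idempotent: ids that already use the OpenAI canonical name are passed to the SDK unchanged.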


## Test Plan
Start the server with the openai provider and send a chat completion call.
2025-05-22 14:51:01 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `config.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `models.py` | feat: expand set of known openai models, allow using openai canonical model names (#2164) | 2025-05-14 13:18:15 -07:00 |
| `openai.py` | fix: openai provider model id (#2229) | 2025-05-22 14:51:01 -07:00 |