llama-stack-mirror/llama_stack/providers/remote/inference/openai
Eric Huang f1f179d8ca fix: openai provider model id
# What does this PR do?
Since https://github.com/meta-llama/llama-stack/pull/2193 switched to the OpenAI SDK, we need to strip the 'openai/' prefix from the model_id before passing it to the SDK.


## Test Plan
Start the server with the openai provider and send a chat completion request.
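
The fix amounts to removing the provider prefix before the model name reaches the OpenAI SDK. A minimal sketch of that prefix stripping (the helper name is hypothetical, not the actual function in openai.py):

```python
def strip_provider_prefix(model_id: str, prefix: str = "openai/") -> str:
    """Strip a provider prefix (e.g. 'openai/') so the bare model name
    is sent to the OpenAI SDK; pass through ids that lack the prefix."""
    if model_id.startswith(prefix):
        return model_id[len(prefix):]
    return model_id

# e.g. "openai/gpt-4o" -> "gpt-4o", "gpt-4o" -> "gpt-4o"
```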
2025-05-22 12:10:57 -07:00
__init__.py chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00
config.py chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00
models.py feat: expand set of known openai models, allow using openai canonical model names (#2164) 2025-05-14 13:18:15 -07:00
openai.py fix: openai provider model id 2025-05-22 12:10:57 -07:00