llama-stack/llama_stack
ehhuang 8feb1827c8
fix: openai provider model id (#2229)
# What does this PR do?
Since https://github.com/meta-llama/llama-stack/pull/2193 switched to the OpenAI SDK, we need to strip the 'openai/' prefix from the `model_id` before passing it to the SDK.
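A minimal sketch of the idea, using a hypothetical helper (the actual provider code may differ in naming and placement):

```python
def _strip_openai_prefix(model_id: str) -> str:
    # Registered model ids carry the provider prefix (e.g. "openai/gpt-4o-mini"),
    # but the OpenAI API expects the bare model name, so drop the prefix if present.
    prefix = "openai/"
    return model_id[len(prefix):] if model_id.startswith(prefix) else model_id


# Example: "openai/gpt-4o-mini" -> "gpt-4o-mini"; ids without the prefix pass through unchanged.
assert _strip_openai_prefix("openai/gpt-4o-mini") == "gpt-4o-mini"
assert _strip_openai_prefix("gpt-4o-mini") == "gpt-4o-mini"
```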


## Test Plan
Start the server with the OpenAI provider configured and send a chat completion call.
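A rough sketch of that test, using the standard `openai` Python client pointed at a locally running stack; the base URL, port, and model id below are assumptions and should be adjusted to your local configuration:

```python
from openai import OpenAI

# Assumed local server address and OpenAI-compatible path; adjust as needed.
client = OpenAI(base_url="http://localhost:8321/v1/openai/v1", api_key="none")

# Model id as registered with the openai provider (hypothetical example).
response = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

With the fix, the provider strips the "openai/" prefix before forwarding the request, so the upstream OpenAI API receives a model name it recognizes.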
| Name | Last commit | Date |
|------|-------------|------|
| apis | feat(sqlite-vec): enable keyword search for sqlite-vec (#1439) | 2025-05-21 15:24:24 -04:00 |
| cli | feat: add llama stack rm command (#2127) | 2025-05-21 10:25:51 +02:00 |
| distribution | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |
| models | fix: llama4 tool use prompt fix (#2103) | 2025-05-06 22:18:31 -07:00 |
| providers | fix: openai provider model id (#2229) | 2025-05-22 14:51:01 -07:00 |
| strong_typing | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| templates | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |
| ui | chore: Updated readme (#2219) | 2025-05-20 17:06:20 +02:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| env.py | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00 |
| log.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| schema_utils.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |