llama-stack-mirror/llama_stack/providers
ehhuang 8feb1827c8
fix: openai provider model id (#2229)
# What does this PR do?
Since https://github.com/meta-llama/llama-stack/pull/2193 switched the provider to the
OpenAI SDK, we need to strip the 'openai/' prefix from the model_id before passing it to the SDK.
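
A minimal sketch of the kind of change this implies (the helper name is hypothetical, assuming provider-scoped model IDs arrive as `openai/<model>`; this is not the actual llama-stack code):

```python
# Hypothetical helper: strip the provider-scoped "openai/" prefix before the
# model ID is handed to the OpenAI SDK. Illustrative only.
def strip_openai_prefix(model_id: str) -> str:
    prefix = "openai/"
    if model_id.startswith(prefix):
        return model_id[len(prefix):]
    return model_id


assert strip_openai_prefix("openai/gpt-4o-mini") == "gpt-4o-mini"
assert strip_openai_prefix("gpt-4o-mini") == "gpt-4o-mini"
```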


## Test Plan
Start the server with the openai provider and send a chat completion call.
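
A hedged example of such a call using the OpenAI Python client; the base URL, port, API key, and model name below are assumptions for illustration and should be adjusted to the local llama-stack configuration:

```python
# Assumed setup: a locally running llama-stack server exposing an
# OpenAI-compatible endpoint. Values here are placeholders, not defaults
# confirmed by this PR.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1", api_key="none")  # assumed URL/key

resp = client.chat.completions.create(
    # Provider-scoped ID; with this fix the server strips the "openai/" prefix
    # before forwarding the request to the OpenAI SDK.
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```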
2025-05-22 14:51:01 -07:00
| Name | Last commit | Date |
|------|-------------|------|
| inline | feat(sqlite-vec): enable keyword search for sqlite-vec (#1439) | 2025-05-21 15:24:24 -04:00 |
| registry | feat(providers): sambanova safety provider (#2221) | 2025-05-21 15:33:02 -07:00 |
| remote | fix: openai provider model id (#2229) | 2025-05-22 14:51:01 -07:00 |
| utils | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |