llama-stack/llama_stack/providers/remote

Latest commit: 8feb1827c8 by ehhuang
fix: openai provider model id (#2229)
# What does this PR do?
Since https://github.com/meta-llama/llama-stack/pull/2193 switched to the
OpenAI SDK, we need to strip the 'openai/' prefix from the model_id.

## Test Plan
Start the server with the openai provider and send a chat completion call.

2025-05-22 14:51:01 -07:00
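The prefix-stripping described in the PR can be sketched as a small helper. This is an illustrative sketch only; the function name `strip_provider_prefix` and the assumption that the routing prefix is exactly `openai/` are hypothetical, not taken from the llama-stack codebase:

```python
def strip_provider_prefix(model_id: str, prefix: str = "openai/") -> str:
    """Remove a provider routing prefix (e.g. 'openai/') from a model id.

    The routing layer may address models as 'openai/gpt-4o', but the OpenAI
    SDK expects the bare model name 'gpt-4o'. Ids without the prefix are
    returned unchanged. (Hypothetical helper for illustration.)
    """
    if model_id.startswith(prefix):
        return model_id[len(prefix):]
    return model_id
```

For example, `strip_provider_prefix("openai/gpt-4o")` yields `"gpt-4o"`, while an already-bare id like `"gpt-4o"` passes through untouched.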
| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| agents | test: add unit test to ensure all config types are instantiable (#1601) | 2025-03-12 22:29:58 -07:00 |
| datasetio | chore(refact): move paginate_records fn outside of datasetio (#2137) | 2025-05-12 10:56:14 -07:00 |
| eval | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| inference | fix: openai provider model id (#2229) | 2025-05-22 14:51:01 -07:00 |
| post_training | fix: Pass model parameter as config name to NeMo Customizer (#2218) | 2025-05-20 09:51:39 -07:00 |
| safety | feat(providers): sambanova safety provider (#2221) | 2025-05-21 15:33:02 -07:00 |
| tool_runtime | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| vector_io | feat(sqlite-vec): enable keyword search for sqlite-vec (#1439) | 2025-05-21 15:24:24 -04:00 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |