llama-stack-mirror/llama_stack/providers/remote
Ben Browning fcdeb3d7bf OpenAI completion prompt can also include tokens
The OpenAI completion API supports strings, array of strings, array of
tokens, or array of token arrays. So, expand our type hinting to
support all of these types.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
2025-04-09 15:47:02 -04:00
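As a rough illustration of the change this commit describes (not the repository's actual definition; the alias name and function below are hypothetical), the widened prompt type hint might look like this in Python:

```python
from typing import List, Union

# Sketch only: the OpenAI completion "prompt" can be a string, a list of
# strings, a list of token IDs, or a list of token-ID lists, so the type
# hint is widened to accept all four shapes.
OpenAICompletionPrompt = Union[str, List[str], List[int], List[List[int]]]

def openai_completion(prompt: OpenAICompletionPrompt) -> None:
    # All of these would now type-check:
    #   "hello world"
    #   ["hello", "world"]
    #   [15339, 1917]
    #   [[15339, 1917], [9906]]
    ...
```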
agents test: add unit test to ensure all config types are instantiable (#1601) 2025-03-12 22:29:58 -07:00
datasetio refactor: extract pagination logic into shared helper function (#1770) 2025-03-31 13:08:29 -07:00
inference OpenAI completion prompt can also include tokens 2025-04-09 15:47:02 -04:00
post_training refactor: move all llama code to models/llama out of meta reference (#1887) 2025-04-07 15:03:58 -07:00
safety feat: added nvidia as safety provider (#1248) 2025-03-17 14:39:23 -07:00
tool_runtime fix(api): don't return list for runtime tools (#1686) 2025-04-01 09:53:11 +02:00
vector_io chore: Updating Milvus Client calls to be non-blocking (#1830) 2025-03-28 22:14:07 -04:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00