llama-stack-mirror/llama_stack
Latest commit fcdeb3d7bf by Ben Browning (2025-04-09 15:47:02 -04:00)

OpenAI completion prompt can also include tokens

The OpenAI completion API accepts a single string, an array of strings, an
array of tokens, or an array of token arrays, so expand our type hinting to
support all of these prompt forms.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
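A minimal sketch of what such a widened prompt hint can look like, assuming a hypothetical alias name and function signature (the repository's actual definitions may differ):

```python
from typing import Union

# Hypothetical alias: the OpenAI completion API accepts a prompt as a single
# string, a batch of strings, one tokenized prompt, or a batch of tokenized
# prompts.
OpenAICompletionPrompt = Union[
    str,              # "Hello, world"
    list[str],        # ["Hello, world", "Goodbye, world"]
    list[int],        # [9906, 11, 1917] -- one prompt as token IDs
    list[list[int]],  # [[9906, 11], [1917]] -- batch of token-ID prompts
]


def openai_completion(model: str, prompt: OpenAICompletionPrompt) -> None:
    """Illustrative signature only; shows where the widened hint applies."""
    ...
```

Widening the hint this way keeps existing string-based callers unchanged while allowing callers that pre-tokenize their prompts to pass token IDs directly.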
Name | Last commit | Last commit date
apis | OpenAI completion prompt can also include tokens | 2025-04-09 15:47:02 -04:00
cli | refactor: move all llama code to models/llama out of meta reference (#1887) | 2025-04-07 15:03:58 -07:00
distribution | OpenAI completion prompt can also include tokens | 2025-04-09 15:47:02 -04:00
models | fix: Mirror llama4 rope scaling fixes, small model simplify (#1917) | 2025-04-09 11:28:45 -07:00
providers | OpenAI completion prompt can also include tokens | 2025-04-09 15:47:02 -04:00
strong_typing | chore: more mypy checks (ollama, vllm, ...) (#1777) | 2025-04-01 17:12:39 +02:00
templates | docs: Update remote-vllm.md with AMD GPU vLLM server supported. (#1858) | 2025-04-08 21:35:32 -07:00
__init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00
env.py | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00
log.py | chore: Remove style tags from log formatter (#1808) | 2025-03-27 10:18:21 -04:00
schema_utils.py | chore: make mypy happy with webmethod (#1758) | 2025-03-22 08:17:23 -07:00