llama-stack-mirror/llama_stack/providers
Latest commit a1da09e166 by Yuan Tang, 2025-03-19 22:41:34 -04:00:
feat: Support "stop" parameter in remote:vLLM
Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
Name          | Last commit                                                                                      | Date
inline        | feat: [New Eval Benchamark] IfEval (#1708)                                                       | 2025-03-19 16:39:59 -07:00
registry      | feat: [New Eval Benchamark] IfEval (#1708)                                                       | 2025-03-19 16:39:59 -07:00
remote        | fix: Updating ToolCall.arguments to allow for json strings that can be decoded on client side (#1685) | 2025-03-19 10:36:19 -07:00
tests         | refactor(test): introduce --stack-config and simplify options (#1404)                            | 2025-03-05 17:02:02 -08:00
utils         | feat: Support "stop" parameter in remote:vLLM                                                    | 2025-03-19 22:41:34 -04:00
__init__.py   | API Updates (#73)                                                                                | 2024-09-17 19:51:35 -07:00
datatypes.py  | chore: move all Llama Stack types from llama-models to llama-stack (#1098)                       | 2025-02-14 09:10:59 -08:00
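The headline commit adds support for the "stop" parameter in the remote:vLLM inference provider. As a minimal sketch of what that exposes: vLLM serves an OpenAI-compatible chat completions API, where "stop" is a list of strings at which generation halts. The model id and stop sequences below are illustrative assumptions, not values from this repository.

```python
import json

# Illustrative OpenAI-compatible chat completion request body; the
# "stop" field lists sequences at which the server ends generation.
payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",  # assumed model id
    "messages": [{"role": "user", "content": "List three colors."}],
    "stop": ["\n\n", "END"],  # generation stops at the first match
    "max_tokens": 64,
}

# Serialized body a client would POST to the vLLM endpoint:
body = json.dumps(payload)
```

A provider supporting this parameter simply forwards the client-supplied stop sequences through to the backend rather than dropping them.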