llama-stack-mirror/llama_stack
Latest commit a1da09e166 by Yuan Tang
feat: Support "stop" parameter in remote:vLLM
Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
2025-03-19 22:41:34 -04:00
Name | Last commit | Last updated
apis | feat: [New Eval Benchamark] IfEval (#1708) | 2025-03-19 16:39:59 -07:00
cli | refactor: simplify command execution and remove PTY handling (#1641) | 2025-03-17 15:03:14 -07:00
distribution | fix: Correctly set CLI_ARGS using BUILD_PLATFORM env with llama stack… (#1702) | 2025-03-19 16:18:11 -07:00
models/llama | feat: Support "stop" parameter in remote:vLLM | 2025-03-19 22:41:34 -04:00
providers | feat: Support "stop" parameter in remote:vLLM | 2025-03-19 22:41:34 -04:00
strong_typing | Ensure that deprecations for fields follow through to OpenAPI | 2025-02-19 13:54:04 -08:00
templates | feat: [New Eval Benchamark] IfEval (#1708) | 2025-03-19 16:39:59 -07:00
__init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00
env.py | refactor(test): move tools, evals, datasetio, scoring and post training tests (#1401) | 2025-03-04 14:53:47 -08:00
log.py | feat: add support for logging config in the run.yaml (#1408) | 2025-03-14 12:36:25 -07:00
schema_utils.py | ci: add mypy for static type checking (#1101) | 2025-02-21 13:15:40 -08:00
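
The latest commit shown above ("Support \"stop\" parameter in remote:vLLM") adds pass-through of stop sequences to the remote vLLM inference provider. The sketch below is a minimal illustration of how that parameter might be used from the Python client; the base URL, model id, and the exact shape of sampling_params (in particular the stop field) are assumptions inferred from the commit title, not verified against this revision of the API.

```python
# Minimal sketch (assumptions noted inline): ask a vLLM-backed model to stop
# generating as soon as it emits the "</answer>" tag.
from llama_stack_client import LlamaStackClient

# Assumed: a llama-stack server on the default port with a vLLM-backed model
# registered under this id; adjust both to match your deployment.
client = LlamaStackClient(base_url="http://localhost:8321")

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.1-8B-Instruct",
    messages=[
        {"role": "user", "content": "Answer inside <answer>...</answer> tags: what is 2 + 2?"}
    ],
    sampling_params={
        "strategy": {"type": "greedy"},
        "max_tokens": 128,
        # Assumption based on the commit title: stop sequences are accepted here
        # and forwarded to vLLM, which cuts the completion off at the first match.
        "stop": ["</answer>"],
    },
)
print(response.completion_message.content)
```

OpenAI-compatible servers such as vLLM truncate generation as soon as any listed stop string is produced, so a provider that silently dropped the field would return text past the intended boundary; forwarding it is what this commit is about.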