llama-stack-mirror/llama_stack
Latest commit: be3adb0964 by Ashwin Bharambe, 2024-10-25 12:03:42 -07:00
    Make vllm inference better

    Tests still don't pass completely (some hang), so there may be some threading issues.
Name          Last commit message                                                                  Last commit date
apis          [Evals API][3/n] scoring_functions / scoring meta-reference implementations (#296)   2024-10-24 14:52:30 -07:00
cli           move build.yaml to templates, symlink in distributions                               2024-10-25 11:54:09 -07:00
distribution  start_container.sh prefix llamastack->distribution name                              2024-10-24 21:29:17 -07:00
providers     Make vllm inference better                                                           2024-10-25 12:03:42 -07:00
scripts       Add a test for CLI, but not fully done so disabled                                   2024-09-19 13:27:07 -07:00
templates     move build.yaml to templates, symlink in distributions                               2024-10-25 11:54:09 -07:00
__init__.py   API Updates (#73)                                                                    2024-09-17 19:51:35 -07:00