llama-stack-mirror/llama_stack/providers/impls/meta_reference/inference
Russell Bryant 5828ffd53b
inference: Fix download command in error msg (#133)
I got this error message and tried to run the command presented,
and it didn't work. The model needs to be given with `--model-id`
instead of as a positional argument.

Signed-off-by: Russell Bryant <rbryant@redhat.com>
2024-09-27 13:31:11 -07:00
quantization Add a test runner and 2 very simple tests for agents 2024-09-19 12:22:48 -07:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
config.py Support for Llama3.2 models and Swift SDK (#98) 2024-09-25 10:29:58 -07:00
generation.py inference: Fix download command in error msg (#133) 2024-09-27 13:31:11 -07:00
inference.py Support for Llama3.2 models and Swift SDK (#98) 2024-09-25 10:29:58 -07:00
model_parallel.py API Updates (#73) 2024-09-17 19:51:35 -07:00
parallel_utils.py API Updates (#73) 2024-09-17 19:51:35 -07:00