llama-stack-mirror/llama_stack/providers/impls/meta_reference/inference
Russell Bryant bec6ab78c2 inference: Fix download command in error msg
I got this error message and tried to run the command it presented,
and it didn't work. The model needs to be given with `--model-id`
instead of as a positional argument.

Signed-off-by: Russell Bryant <rbryant@redhat.com>
2024-09-27 14:14:07 +00:00
quantization Add a test runner and 2 very simple tests for agents 2024-09-19 12:22:48 -07:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
config.py Support for Llama3.2 models and Swift SDK (#98) 2024-09-25 10:29:58 -07:00
generation.py inference: Fix download command in error msg 2024-09-27 14:14:07 +00:00
inference.py Support for Llama3.2 models and Swift SDK (#98) 2024-09-25 10:29:58 -07:00
model_parallel.py API Updates (#73) 2024-09-17 19:51:35 -07:00
parallel_utils.py API Updates (#73) 2024-09-17 19:51:35 -07:00