llama-stack-mirror/llama_stack/providers/impls
Russell Bryant bec6ab78c2 inference: Fix download command in error msg
I got this error message and tried to run the command it presented,
but it didn't work. The model needs to be given with `--model-id`
instead of as a positional argument.

Signed-off-by: Russell Bryant <rbryant@redhat.com>
2024-09-27 14:14:07 +00:00
ios/inference Drop header from LocalInference.h 2024-09-25 11:27:37 -07:00
meta_reference inference: Fix download command in error msg 2024-09-27 14:14:07 +00:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00