Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-06-29 03:14:19 +00:00)
I got this error message and tried to run the command presented, and it didn't work. The model needs to be given with `--model-id` instead of as a positional argument. Signed-off-by: Russell Bryant <rbryant@redhat.com>
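The fix described in the commit message can be illustrated with a minimal `argparse` sketch. This is a hypothetical example of taking a model via a named `--model-id` flag rather than a positional argument; the parser name and model id are illustrative, not llama-stack's actual CLI:

```python
import argparse

# Hypothetical sketch: the model is supplied via a named --model-id
# flag instead of a positional argument, as the commit describes.
parser = argparse.ArgumentParser(prog="example-cli")
parser.add_argument("--model-id", required=True,
                    help="Identifier of the model to use")

# Parsing a sample command line; argparse maps --model-id to args.model_id.
args = parser.parse_args(["--model-id", "Llama-3.1-8B"])
print(args.model_id)
```

With a named flag, the help text and error messages make the required argument explicit, which avoids the mismatch the commit message reports.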
Directory contents:

- quantization/
- __init__.py
- config.py
- generation.py
- inference.py
- model_parallel.py
- parallel_utils.py