forked from phoenix-oss/llama-stack-mirror
## Context

The documentation on downloading models from the Meta source (https://llama-stack.readthedocs.io/en/latest/references/llama_cli_reference/index.html#downloading-from-meta) confused me and a colleague: we ran into [this issue](https://github.com/meta-llama/llama-stack/issues/746) while downloading. After some debugging, I found that `META_URL` must be quoted in the command. To spare other users the same confusion, I updated the doc to make this clear.

## Test

Before:

After:
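A minimal sketch of why the quoting matters (the URL below is illustrative, not a real signed Meta URL): the signed download link contains `&` characters, which an unquoted shell command treats as control operators, silently truncating the URL before it ever reaches the CLI.

```shell
# Signed download URLs carry several query parameters joined by '&'.
# Unquoted, the shell interprets each '&' as a command separator and the
# CLI only ever sees the part before the first '&'. Quoting the value
# passes the whole URL through as a single argument.
url='https://example.com/download?Policy=abc&Signature=xyz&Key-Pair-Id=k1'

# Quoted expansion: the full URL survives intact.
printf '%s\n' "$url"
```

The same applies when pasting the URL directly into `llama download ...`: wrap the pasted link in single quotes so the shell does not split it.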