From 3a002f6cf190ea3be628a843cadc71fa152e9847 Mon Sep 17 00:00:00 2001
From: Reid <61492567+reidliu41@users.noreply.github.com>
Date: Wed, 26 Feb 2025 13:38:10 +0800
Subject: [PATCH] chore: update download error message (#1217)

# What does this PR do?

An incorrect Hugging Face token also triggers `RepositoryNotFoundError`, not just a missing repository, so the error message should mention the token as a possible cause. For example:

```
$ llama model download --source huggingface --model-id Llama3.2-1B-Instruct:int4-qlora-eo8 --hf-token xx   ### xx is an incorrect token

----RepositoryNotFoundError--->
usage: llama model download [-h] [--source {meta,huggingface}] [--model-id MODEL_ID] [--hf-token HF_TOKEN] [--meta-url META_URL] [--max-parallel MAX_PARALLEL] [--ignore-patterns IGNORE_PATTERNS] [--manifest-file MANIFEST_FILE]
llama model download: error: Repository 'meta-llama/Llama-3.2-1B-Instruct-QLORA_INT4_EO8' not found on the Hugging Face Hub.
```

With this change, the same command reports:

```
$ llama model download --source huggingface --model-id Llama3.2-1B-Instruct:int4-qlora-eo8 --hf-token xx

----RepositoryNotFoundError--->
usage: llama model download [-h] [--source {meta,huggingface}] [--model-id MODEL_ID] [--hf-token HF_TOKEN] [--meta-url META_URL] [--max-parallel MAX_PARALLEL] [--ignore-patterns IGNORE_PATTERNS] [--manifest-file MANIFEST_FILE]
llama model download: error: Repository 'meta-llama/Llama-3.2-1B-Instruct-QLORA_INT4_EO8' not found on the Hugging Face Hub or incorrect Hugging Face token.
```

## Test Plan

[Describe the tests you ran to verify your changes with result summaries. *Provide clear instructions so the plan can be easily re-executed.*]

Signed-off-by: reidliu
Co-authored-by: reidliu
---
 llama_stack/cli/download.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/llama_stack/cli/download.py b/llama_stack/cli/download.py
index af86f7243..b43d50217 100644
--- a/llama_stack/cli/download.py
+++ b/llama_stack/cli/download.py
@@ -343,7 +343,7 @@ def _hf_download(
             "You can find your token by visiting https://huggingface.co/settings/tokens"
         )
     except RepositoryNotFoundError:
-        parser.error(f"Repository '{repo_id}' not found on the Hugging Face Hub.")
+        parser.error(f"Repository '{repo_id}' not found on the Hugging Face Hub or incorrect Hugging Face token.")
     except Exception as e:
         parser.error(e)
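
For context, here is a minimal, hedged sketch of the kind of error handling this hunk lives in. The function name `hf_download_sketch`, the `SystemExit`-based error reporting, and the `hf_token`/`output_dir` parameters are illustrative stand-ins (the real CLI reports errors through `parser.error`); `snapshot_download`, `GatedRepoError`, and `RepositoryNotFoundError` are real `huggingface_hub` APIs. The point it illustrates is that `RepositoryNotFoundError` is raised both for a nonexistent repo and for an invalid token, which is why the message wording changes.

```python
# Illustrative sketch only; not a copy of llama_stack/cli/download.py.
from huggingface_hub import snapshot_download
from huggingface_hub.utils import GatedRepoError, RepositoryNotFoundError


def hf_download_sketch(repo_id: str, hf_token: str | None, output_dir: str) -> str:
    """Download a model snapshot, mapping HF errors to CLI-style messages."""
    try:
        # snapshot_download returns the local path of the downloaded snapshot.
        return snapshot_download(repo_id, local_dir=output_dir, token=hf_token)
    except GatedRepoError:
        # Gated repo: the repo exists but requires an accepted license and a valid token.
        raise SystemExit(
            "You need a valid --hf-token with access to this repository. "
            "You can find your token by visiting https://huggingface.co/settings/tokens"
        )
    except RepositoryNotFoundError:
        # Raised both when the repo does not exist and when the token is invalid,
        # which is why the message mentions both possibilities.
        raise SystemExit(
            f"Repository '{repo_id}' not found on the Hugging Face Hub "
            "or incorrect Hugging Face token."
        )
```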