llama-stack/llama_stack/cli
Latest commit 0713607b68 by Ashwin Bharambe: Support parallel downloads for `llama model download` (#448)
# What does this PR do?

Enables parallel downloads for the `llama model download` CLI command. This is especially important for users with high-bandwidth Internet connections, who would otherwise be bottlenecked by a single download stream when fetching large checkpoints.
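The general technique is to split a file into byte ranges and fetch them concurrently, then reassemble the chunks in order. The following is a minimal sketch of that pattern using `concurrent.futures`; the `fetch_chunk` function, URL, and sizes are hypothetical stand-ins, not the actual implementation in `download.py`.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical stand-in for an HTTP range request; a real downloader
# would issue a GET with a "Range: bytes=start-end" header.
def fetch_chunk(url: str, start: int, end: int) -> bytes:
    # Simulate retrieving bytes [start, end) of the file.
    return bytes(range(start % 256, start % 256 + (end - start)))

def parallel_download(url: str, total_size: int,
                      chunk_size: int = 4, max_workers: int = 4) -> bytes:
    """Split the file into ranges and fetch them concurrently."""
    ranges = [(off, min(off + chunk_size, total_size))
              for off in range(0, total_size, chunk_size)]
    parts: dict[int, bytes] = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Map each future back to its chunk index so results can be ordered.
        futures = {pool.submit(fetch_chunk, url, s, e): i
                   for i, (s, e) in enumerate(ranges)}
        for fut in as_completed(futures):
            parts[futures[fut]] = fut.result()
    # Chunks may complete out of order; reassemble by index.
    return b"".join(parts[i] for i in range(len(ranges)))

data = parallel_download("https://example.com/model.bin", total_size=10)
print(len(data))  # 10
```

The key design point is that completion order is nondeterministic, so each chunk is keyed by its index and the final file is joined in index order.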

## Test Plan


![image](https://github.com/user-attachments/assets/f5df69e2-ec4f-4360-bf84-91273d8cee22)
Committed 2024-11-14 09:56:22 -08:00
| Name | Last commit | Date |
| --- | --- | --- |
| model | No automatic pager | 2024-10-02 12:26:09 -07:00 |
| scripts | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| stack | Fix build configure deprecation message (#456) | 2024-11-14 09:56:03 -08:00 |
| tests | Rename all inline providers with an inline:: prefix (#423) | 2024-11-11 22:19:16 -08:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| download.py | Support parallel downloads for llama model download (#448) | 2024-11-14 09:56:22 -08:00 |
| llama.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| subcommand.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| table.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |