Latest commit: regenerate the Llama 4 prompt-format documentation via `scripts/generate_prompt_format.py` (the command from the commit message is reproduced below). Co-authored-by: Eric Huang <erichuang@fb.com>
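The command, verbatim from the commit message (the `<path>` checkpoint segment is a placeholder in the original and is left as-is):

```shell
torchrun --nproc_per_node=8 scripts/generate_prompt_format.py \
  meta-llama/Llama-4-Scout-17B-16E-Instruct \
  ~/local/checkpoints/<path>/ \
  llama_stack.models.llama.llama4.prompts \
  llama_stack/models/llama/llama4/prompt_format.md
```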
Contents of `llama_stack/models/llama/`:

Directories:

- `llama3/`
- `llama3_1/`
- `llama3_2/`
- `llama3_3/`
- `llama4/`
- `resources/`

Files:

- `__init__.py`
- `checkpoint.py`
- `datatypes.py`
- `hadamard_utils.py`
- `prompt_format.py`
- `quantize_impls.py`
- `sku_list.py`
- `sku_types.py`
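As a quick orientation to the listed modules, here is a minimal, hypothetical sketch of looking up a model SKU. It assumes `sku_list.py` exposes a `resolve_model()` helper that returns a model record (of a type defined in `sku_types.py`) or `None`; the name and signature are assumptions, not verified against this checkout:

```python
# Hypothetical usage sketch; assumes sku_list.py exposes resolve_model()
# and that it returns a model record (defined in sku_types.py) or None.
from llama_stack.models.llama.sku_list import resolve_model

model = resolve_model("meta-llama/Llama-4-Scout-17B-16E-Instruct")
if model is None:
    print("unknown model descriptor")
else:
    # The record bundles the SKU metadata used elsewhere in the package.
    print(model)
```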