Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-06-27 18:50:41 +00:00.
Latest commit: regenerate the Llama 4 prompt format documentation (llama_stack/models/llama/llama4/prompt_format.md) with:

torchrun --nproc_per_node=8 scripts/generate_prompt_format.py meta-llama/Llama-4-Scout-17B-16E-Instruct ~/local/checkpoints/<path>/ llama_stack.models.llama.llama4.prompts llama_stack/models/llama/llama4/prompt_format.md

Co-authored-by: Eric Huang <erichuang@fb.com>
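The command points the generator at example prompts defined in `llama_stack.models.llama.llama4.prompts` and writes the resulting markdown to `prompt_format.md`; torchrun with `--nproc_per_node=8` is presumably needed to load the sharded checkpoint and produce real model outputs. As a rough, hypothetical sketch of how such a generator could render examples into a markdown file (this is not the repository's actual `scripts/generate_prompt_format.py`; the `PromptExample` class and helper names below are illustrative assumptions):

```python
"""Hypothetical sketch of a prompt-format doc generator.

NOT llama-stack's scripts/generate_prompt_format.py; the class and function
names are assumptions about how example prompts might be rendered to markdown.
"""
from dataclasses import dataclass, field
from pathlib import Path


def _as_code_block(text: str) -> str:
    # Indent prompt text by four spaces so markdown renders it as a code block.
    return "\n".join("    " + line for line in text.splitlines())


@dataclass
class PromptExample:
    # One documented use case: a title, a short explanation, and raw prompt text.
    title: str
    description: str
    prompts: list[str] = field(default_factory=list)

    def to_markdown(self) -> str:
        blocks = "\n\n".join(_as_code_block(p) for p in self.prompts)
        return f"## {self.title}\n\n{self.description}\n\n{blocks}\n"


def write_prompt_format(examples: list[PromptExample], out_path: Path) -> None:
    # Concatenate all example sections into a single markdown document.
    sections = "\n".join(example.to_markdown() for example in examples)
    out_path.write_text("# Llama 4 - Prompt Formats\n\n" + sections)


if __name__ == "__main__":
    examples = [
        PromptExample(
            title="Simple user/assistant conversation",
            description=(
                "A plain chat completion with the instruct model; the real "
                "document shows the model's special tokens verbatim."
            ),
            prompts=["User: Who are you?\nAssistant: I am a helpful assistant."],
        )
    ]
    write_prompt_format(examples, Path("prompt_format.md"))
```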
Directory contents:

- apis/
- cli/
- distribution/
- models/
- providers/
- strong_typing/
- templates/
- __init__.py
- env.py
- log.py
- schema_utils.py