llama-stack-mirror/llama_stack/templates
Charlie Doern 6494658a10 feat: add finetune_multi_device recipe with fsdp support
The HF SFTTrainer supports distributed training using FSDP.

Add a new recipe, `finetune_multi_device`, which supports multi-GPU (CUDA) training
using FSDP and optionally LoRA.

transformers hides a lot of its FSDP usage behind the training args:
a6b51e7341/src/transformers/training_args.py (L1535)

You need to pass both `fsdp` and `fsdp_config` to get it to work properly. However,
it seems many of the `fsdp_config` entries are silently ignored. The key things to get this working were:
full_shard
offload (cpu offload)
transformer_layer_cls_to_wrap (model specific wrapping)
cpu_ram_efficient_loading
sharding_strategy
limit_all_gathers
sync_module_states
backward_prefetch
use_orig_params

These can be seen in both the `fsdp=` and `fsdp_config=` arguments in the `SFTConfig` call.
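
As a rough illustration, a minimal `SFTConfig` call wiring these together might look like the sketch below. This is not the recipe's exact code: the layer class name is a placeholder, and which `fsdp_config` keys are honored varies by transformers version.

```python
# Minimal sketch (not the recipe's exact code) of passing both `fsdp` and
# `fsdp_config` to SFTConfig. Key names follow the list above; unsupported
# entries may be silently ignored by transformers.
from trl import SFTConfig

training_args = SFTConfig(
    output_dir="./checkpoints",
    # sharding strategy and CPU offload are selected via the `fsdp` string
    fsdp="full_shard offload",
    # the remaining knobs go through `fsdp_config`
    fsdp_config={
        "transformer_layer_cls_to_wrap": ["LlamaDecoderLayer"],  # model specific (placeholder)
        "cpu_ram_efficient_loading": True,
        "limit_all_gathers": True,
        "sync_module_states": True,
        "backward_prefetch": "backward_pre",
        "use_orig_params": True,
    },
)
```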

I have tested this successfully with different model architectures, both with and without LoRA.

The user can now toggle `recipe` in their provider config between `single` and `multi` to select between the two recipes.

For debugging purposes, NCCL logging settings can now be set via the provider config as well.
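
As a hedged sketch of what that amounts to at runtime, NCCL debug logging is controlled through environment variables that must be set before torch.distributed / FSDP initializes; the provider config field names that map onto these are assumptions, not the recipe's exact schema.

```python
# Hedged sketch: surface NCCL debug logging via environment variables before
# distributed init. The mapping from provider config fields to these env vars
# is assumed for illustration only.
import os

os.environ.setdefault("NCCL_DEBUG", "INFO")        # e.g. WARN, INFO, TRACE
os.environ.setdefault("NCCL_DEBUG_SUBSYS", "ALL")  # narrow to e.g. "INIT,COLL" if too noisy
```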

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-06-12 13:33:33 -04:00
bedrock revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
cerebras revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
ci-tests revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
dell revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
experimental-post-training feat: add huggingface post_training impl (#2132) 2025-05-16 14:41:28 -07:00
fireworks feat: reference implementation for files API (#2330) 2025-06-02 21:54:24 -07:00
groq feat(distro): add more providers to starter distro, prefix conflicting models (#2362) 2025-06-03 12:10:46 -07:00
hf-endpoint revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
hf-serverless revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
llama_api revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
meta-reference-gpu revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
nvidia revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
ollama feat: add finetune_multi_device recipe with fsdp support 2025-06-12 13:33:33 -04:00
open-benchmark revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
passthrough revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
postgres-demo feat: add deps dynamically based on metastore config (#2405) 2025-06-05 14:07:25 -07:00
remote-vllm revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
sambanova revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
starter fix: vllm starter name (#2392) 2025-06-04 16:21:36 +02:00
tgi revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
together revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
vllm-gpu revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
watsonx revert: "chore: Remove zero-width space characters from OTEL service" (#2331) 2025-06-02 14:21:35 -07:00
__init__.py Auto-generate distro yamls + docs (#468) 2024-11-18 14:57:06 -08:00
template.py feat: add deps dynamically based on metastore config (#2405) 2025-06-05 14:07:25 -07:00