llama-stack-mirror/llama_stack/providers/inline/post_training
Nehanth Narendrula 58ffd82853
fix: Update SFTConfig parameter to fix CI and Post Training Workflow (#2948)
# What does this PR do?

- Change `max_seq_length` to `max_length` in the `SFTConfig` constructor
- TRL deprecated `max_seq_length` in Feb 2025 and removed it in v0.20.0
- Reference: https://github.com/huggingface/trl/pull/2895

This resolves the SFT training failure in CI tests.
2025-07-29 11:14:04 -07:00
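A minimal sketch of the kind of compatibility handling this rename implies. The shim below is hypothetical (not part of llama-stack or TRL): it inspects the config class's constructor and picks whichever keyword that TRL version accepts. The two stand-in classes exist only so the sketch runs without TRL installed; with TRL available you would pass `trl.SFTConfig` instead.

```python
import inspect


def sft_config_kwargs(config_cls, length: int) -> dict:
    """Pick the sequence-length keyword this SFTConfig version accepts.

    Newer TRL (>= v0.20.0) only accepts `max_length`; older releases
    accepted `max_seq_length`.
    """
    params = inspect.signature(config_cls.__init__).parameters
    key = "max_length" if "max_length" in params else "max_seq_length"
    return {key: length}


# Stand-ins for trl.SFTConfig so the sketch is self-contained.
class NewSFTConfig:  # mimics TRL >= v0.20.0
    def __init__(self, max_length=None, **kwargs):
        self.max_length = max_length


class OldSFTConfig:  # mimics pre-deprecation TRL
    def __init__(self, max_seq_length=None, **kwargs):
        self.max_seq_length = max_seq_length


print(sft_config_kwargs(NewSFTConfig, 2048))  # {'max_length': 2048}
print(sft_config_kwargs(OldSFTConfig, 2048))  # {'max_seq_length': 2048}
```

The fix in this PR takes the simpler route of just updating the keyword, which is appropriate once the provider pins a TRL version that has completed the removal.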
| Name | Last commit | Date |
| --- | --- | --- |
| common | feat: add huggingface post_training impl (#2132) | 2025-05-16 14:41:28 -07:00 |
| huggingface | fix: Update SFTConfig parameter to fix CI and Post Training Workflow (#2948) | 2025-07-29 11:14:04 -07:00 |
| torchtune | chore: add mypy post training (#2675) | 2025-07-09 15:44:39 +02:00 |
| `__init__.py` | Add init files to post training folders (#711) | 2025-01-13 20:19:18 -08:00 |