fix: Update SFTConfig parameter to fix CI and Post Training Workflow (#2948)

# What does this PR do?

- Change `max_seq_length` to `max_length` in the `SFTConfig` constructor
- TRL deprecated `max_seq_length` in Feb 2025 and removed it in v0.20.0
- Reference: https://github.com/huggingface/trl/pull/2895

This resolves the SFT training failure in the CI tests.
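
For context, a minimal sketch (not code from this repo) of passing the renamed parameter to `SFTConfig` on TRL >= 0.20.0; the `output_dir` path and the length value of 2048 are illustrative:

```python
# Minimal sketch, assuming TRL >= 0.20.0: the sequence-length cap is now
# passed as `max_length` (formerly `max_seq_length`).
from trl import SFTConfig

training_args = SFTConfig(
    output_dir="./sft-output",  # illustrative path, not from this PR
    max_length=2048,            # was `max_seq_length` before TRL v0.20.0
    report_to="none",
)
```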
Author: Nehanth Narendrula (2025-07-29 14:14:04 -04:00), committed via GitHub
Parent commit: c7dc0f21b4
Commit: 58ffd82853

```diff
@@ -469,7 +469,7 @@ class HFFinetuningSingleDevice:
             use_cpu=True if device.type == "cpu" and not torch.backends.mps.is_available() else False,
             save_strategy=save_strategy,
             report_to="none",
-            max_seq_length=provider_config.max_seq_length,
+            max_length=provider_config.max_seq_length,
             gradient_accumulation_steps=config.gradient_accumulation_steps,
             gradient_checkpointing=provider_config.gradient_checkpointing,
             learning_rate=lr,
```
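
A provider that must run against both pre- and post-v0.20.0 TRL could branch on the installed version. The sketch below is a hypothetical compatibility shim under that assumption, not code from this PR:

```python
# Hypothetical compatibility shim (not part of this PR): choose the keyword
# name for the sequence-length cap based on the installed TRL version.
from importlib.metadata import version

from packaging.version import Version
from trl import SFTConfig

seq_len_kw = "max_length" if Version(version("trl")) >= Version("0.20.0") else "max_seq_length"

training_args = SFTConfig(
    output_dir="./sft-output",  # illustrative
    report_to="none",
    **{seq_len_kw: 2048},       # illustrative cap
)
```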