llama-stack-mirror/llama_stack/providers
Nehanth Narendrula 58ffd82853
fix: Update SFTConfig parameter to fix CI and Post Training Workflow (#2948)
# What does this PR do?

- Change `max_seq_length` to `max_length` in the `SFTConfig` constructor
- TRL deprecated `max_seq_length` in Feb 2025 and removed it in v0.20.0
- Reference: https://github.com/huggingface/trl/pull/2895

This resolves the SFT training failure in CI tests
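The rename amounts to passing `max_length=...` where `max_seq_length=...` was used before. A minimal sketch of the idea, assuming plain keyword arguments (`normalize_sft_kwargs` is a hypothetical compatibility helper, not part of llama-stack or TRL):

```python
def normalize_sft_kwargs(kwargs: dict) -> dict:
    """Map the removed TRL keyword `max_seq_length` to its replacement
    `max_length` (deprecated upstream, removed in TRL v0.20.0)."""
    out = dict(kwargs)
    if "max_seq_length" in out and "max_length" not in out:
        # pop the old keyword and re-key it under the new name
        out["max_length"] = out.pop("max_seq_length")
    return out

# e.g. SFTConfig(**normalize_sft_kwargs({"max_seq_length": 2048}))
# would now receive max_length=2048 instead of the removed keyword
```

On TRL versions >= 0.20.0, passing `max_seq_length` directly raises a `TypeError` for an unexpected keyword argument, which is why the CI training job failed until the parameter was renamed.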
2025-07-29 11:14:04 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| inline | fix: Update SFTConfig parameter to fix CI and Post Training Workflow (#2948) | 2025-07-29 11:14:04 -07:00 |
| registry | feat: implement chunk deletion for vector stores (#2701) | 2025-07-25 10:30:30 -04:00 |
| remote | feat(openai): add configurable base_url support with OPENAI_BASE_URL env var (#2919) | 2025-07-28 10:16:02 -07:00 |
| utils | feat: implement dynamic model detection support for inference providers using litellm (#2886) | 2025-07-28 10:13:54 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | feat(starter)!: simplify starter distro; litellm model registry changes (#2916) | 2025-07-25 15:02:04 -07:00 |