Note: the `trl` dependency brings in `accelerate`, which in turn pulls in NVIDIA dependencies for torch. We cannot have that in the starter distro, so there is no CPU-only post-training variant of the HuggingFace provider.
Post_Training
Overview
This section contains documentation for all available providers for the `post_training` API.
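For orientation, below is a minimal sketch of driving the `post_training` API from the Python client against whichever of the providers listed under "Providers" your distribution is configured with. The method, parameter, model, and dataset names are illustrative assumptions based on the llama-stack client API and may differ in your installed version; consult the per-provider pages for the exact, supported configuration.

```python
# Minimal sketch (not a reference): assumes a llama-stack server is running on
# the default port and that the distribution has a post_training provider
# configured. Field names are assumptions and may differ between versions.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# Kick off a supervised fine-tuning job; "my-sft-job", the model id, and the
# dataset id are placeholders for illustration.
job = client.post_training.supervised_fine_tune(
    job_uuid="my-sft-job",
    model="meta-llama/Llama-3.2-3B-Instruct",
    checkpoint_dir="",
    algorithm_config={
        "type": "LoRA",  # parameter-efficient LoRA fine-tuning
        "lora_attn_modules": ["q_proj", "v_proj"],
        "apply_lora_to_mlp": True,
        "apply_lora_to_output": False,
        "rank": 8,
        "alpha": 16,
    },
    training_config={
        "n_epochs": 1,
        "data_config": {
            "dataset_id": "my-dataset",  # a dataset already registered with the stack
            "batch_size": 1,
            "shuffle": True,
        },
    },
    hyperparam_search_config={},
    logger_config={},
)

# Check on the job; the accessor path is likewise an assumption.
status = client.post_training.job.status(job_uuid="my-sft-job")
print(status)
```

Which provider actually executes the job is decided by the server's run configuration, not by the client call, so the same client code applies to any of the providers documented below.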
Providers
inline_huggingface-gpu
inline_torchtune-cpu
inline_torchtune-gpu
remote_nvidia