# llama-stack-mirror/llama_stack/providers/inline/post_training/torchtune
Commit c219a74fa0 by Ihar Hrachyshka
fix: Don't require efficiency_config for torchtune (#2104)
# What does this PR do?

Revert a change that mistakenly forced `efficiency_config` on torchtune
provider users.

```
    fix: Don't require efficiency_config for torchtune

    It was enforced by mistake when
    0751a960a5 merged.

    The other asserts made sense because the surrounding code was written
    to always expect a non-None value. That was never the intent for
    efficiency_config.
```
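
In practice, the fix amounts to treating `efficiency_config` as optional during recipe setup. Below is a minimal sketch of that pattern; the names here (`TrainingConfig`, `EfficiencyConfig`, `setup_training`, `n_epochs`) are illustrative stand-ins, not the provider's actual code.

```python
# Illustrative sketch only: these types and the setup function are
# stand-ins for the torchtune provider's real recipe code.
from dataclasses import dataclass


@dataclass
class EfficiencyConfig:
    enable_activation_checkpointing: bool = False


@dataclass
class TrainingConfig:
    n_epochs: int | None = None
    efficiency_config: EfficiencyConfig | None = None  # optional field


def setup_training(cfg: TrainingConfig) -> None:
    # Asserting on fields the recipe genuinely always needs is fine ...
    assert cfg.n_epochs is not None, "n_epochs must be set"

    # ... but efficiency_config must stay optional: guard it with an
    # `is not None` check instead of an assert, and apply it only when
    # the user actually provided one.
    if cfg.efficiency_config is not None:
        print(
            "activation checkpointing:",
            cfg.efficiency_config.enable_activation_checkpointing,
        )
```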

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-05-06 09:50:44 -07:00
| Name | Last commit | Date |
| --- | --- | --- |
| `common` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `datasets` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `recipes` | fix: Don't require efficiency_config for torchtune (#2104) | 2025-05-06 09:50:44 -07:00 |
| `__init__.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `config.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `post_training.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |