llama-stack/llama_stack/providers
Botao Chen 35bf6ea75a
Pin torchtune pkg version (#791)
## context
This is a follow-up to
https://github.com/meta-llama/llama-stack/pull/674. torchtune is
still in alpha, so its APIs are not guaranteed to be backward
compatible. Pin the torchtune and torchao package versions so that a
new torchtune release cannot break llama stack post training.

We will bump the pinned version manually after testing each new
package release.
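
A pin like the one described above would look something like the following in a requirements/dependency file. The torchtune 0.4.0 version matches the test below; the torchao version here is purely illustrative, not the PR's actual pin:

```
# Exact pins (==) so a new upstream release cannot be picked up silently.
torchtune==0.4.0
torchao==0.7.0   # illustrative version, not taken from this PR
```

An exact `==` pin, unlike a `>=` lower bound, rejects every release other than the named one, which is the point when upstream APIs are still unstable.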

## test
Pinned an old torchtune version (0.4.0) and confirmed that 0.4.0 was
installed:
<img width="1016" alt="Screenshot 2025-01-16 at 3 06 47 PM"
src="https://github.com/user-attachments/assets/630b05d0-8d0d-4e2f-8b48-22e578a62659"
/>
2025-01-16 16:31:13 -08:00
| Name | Last commit | Last updated |
|------|-------------|--------------|
| inline | Make llama stack build not create a new conda by default (#788) | 2025-01-16 13:44:53 -08:00 |
| registry | Pin torchtune pkg version (#791) | 2025-01-16 16:31:13 -08:00 |
| remote | fireworks add completion logprobs adapter (#778) | 2025-01-16 10:37:07 -08:00 |
| tests | [Test automation] generate custom test report (#739) | 2025-01-16 15:33:50 -08:00 |
| utils | Idiomatic REST API: Telemetry (#786) | 2025-01-16 12:08:46 -08:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | Tools API with brave and MCP providers (#639) | 2024-12-19 21:25:17 -08:00 |