llama-stack-mirror/llama_stack/models/llama
Latest commit ba7e95e035 by Sébastien Han
ci: add python package build test
* We now test a package build on every PR.
* Add a pre-commit rule to validate the presence of __init__.py in
  directories that contain Python files (a sketch of such a check is below).

Closes: https://github.com/meta-llama/llama-stack/issues/2406
Signed-off-by: Sébastien Han <seb@redhat.com>
2025-06-19 11:08:30 +02:00
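
For context, the two items above describe CI/tooling changes whose actual implementation is not shown on this page. The build test presumably amounts to running something like `python -m build` in the workflow, and the __init__.py rule could be approximated by a script such as the minimal sketch below; the script name and the `llama_stack` scan root are assumptions, not the repository's actual hook.

```python
#!/usr/bin/env python3
# check_init_py.py -- hypothetical sketch of the pre-commit rule described
# above: fail if any directory containing Python files lacks an __init__.py.
# The scan root is an assumption; the real hook may differ in scope.
import sys
from pathlib import Path


def find_missing_init(root: Path) -> list[Path]:
    """Return directories under `root` that hold .py files but no __init__.py."""
    missing = set()
    for py_file in root.rglob("*.py"):
        pkg_dir = py_file.parent
        if not (pkg_dir / "__init__.py").exists():
            missing.add(pkg_dir)
    return sorted(missing)


if __name__ == "__main__":
    bad_dirs = find_missing_init(Path("llama_stack"))  # assumed scan root
    for d in bad_dirs:
        print(f"missing __init__.py: {d}")
    sys.exit(1 if bad_dirs else 0)
```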
Name               | Last commit                                                                  | Date
llama3             | chore: remove usage of load_tiktoken_bpe (#2276)                             | 2025-06-02 07:33:37 -07:00
llama3_1           | chore: enable pyupgrade fixes (#1806)                                        | 2025-05-01 14:23:50 -07:00
llama3_2           | refactor: move all llama code to models/llama out of meta reference (#1887)  | 2025-04-07 15:03:58 -07:00
llama3_3           | ci: add python package build test                                            | 2025-06-19 11:08:30 +02:00
llama4             | ci: add python package build test                                            | 2025-06-19 11:08:30 +02:00
resources          | feat: introduce llama4 support (#1877)                                       | 2025-04-05 11:53:35 -07:00
__init__.py        | feat: introduce llama4 support (#1877)                                       | 2025-04-05 11:53:35 -07:00
checkpoint.py      | chore: enable pyupgrade fixes (#1806)                                        | 2025-05-01 14:23:50 -07:00
datatypes.py       | chore: enable pyupgrade fixes (#1806)                                        | 2025-05-01 14:23:50 -07:00
hadamard_utils.py  | refactor: move all llama code to models/llama out of meta reference (#1887)  | 2025-04-07 15:03:58 -07:00
prompt_format.py   | chore: enable pyupgrade fixes (#1806)                                        | 2025-05-01 14:23:50 -07:00
quantize_impls.py  | chore: enable pyupgrade fixes (#1806)                                        | 2025-05-01 14:23:50 -07:00
sku_list.py        | chore: more mypy fixes (#2029)                                               | 2025-05-06 09:52:31 -07:00
sku_types.py       | chore: enable pyupgrade fixes (#1806)                                        | 2025-05-01 14:23:50 -07:00
tokenizer_utils.py | chore: remove usage of load_tiktoken_bpe (#2276)                             | 2025-06-02 07:33:37 -07:00