| Name | Last commit | Date |
| --- | --- | --- |
| llama3/ | chore: remove usage of load_tiktoken_bpe (#2276) | 2025-06-02 07:33:37 -07:00 |
| llama3_1/ | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| llama3_2/ | refactor: move all llama code to models/llama out of meta reference (#1887) | 2025-04-07 15:03:58 -07:00 |
| llama3_3/ | ci: add python package build test (#2457) | 2025-06-19 18:57:32 +05:30 |
| llama4/ | ci: add python package build test (#2457) | 2025-06-19 18:57:32 +05:30 |
| resources/ | feat: introduce llama4 support (#1877) | 2025-04-05 11:53:35 -07:00 |
| __init__.py | feat: introduce llama4 support (#1877) | 2025-04-05 11:53:35 -07:00 |
| checkpoint.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| datatypes.py | fix: finish conversion to StrEnum (#2514) | 2025-06-26 08:01:26 +05:30 |
| hadamard_utils.py | refactor: move all llama code to models/llama out of meta reference (#1887) | 2025-04-07 15:03:58 -07:00 |
| prompt_format.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| quantize_impls.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| sku_list.py | chore: more mypy fixes (#2029) | 2025-05-06 09:52:31 -07:00 |
| sku_types.py | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| tokenizer_utils.py | chore: remove usage of load_tiktoken_bpe (#2276) | 2025-06-02 07:33:37 -07:00 |