| Name | Last commit | Date |
| --- | --- | --- |
| `prompt_templates` | ci: add python package build test (#2457) | 2025-06-19 18:57:32 +05:30 |
| `quantization` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `vision` | ci: add python package build test (#2457) | 2025-06-19 18:57:32 +05:30 |
| `__init__.py` | feat: introduce llama4 support (#1877) | 2025-04-05 11:53:35 -07:00 |
| `args.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `chat_format.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `datatypes.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `ffn.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `generation.py` | chore: make cprint write to stderr (#2250) | 2025-05-24 23:39:57 -07:00 |
| `model.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `moe.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `preprocess.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `prompt_format.md` | fix: llama4 tool use prompt fix (#2103) | 2025-05-06 22:18:31 -07:00 |
| `prompts.py` | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| `tokenizer.model` | feat(pre-commit): enhance pre-commit hooks with additional checks (#2014) | 2025-04-30 11:35:49 -07:00 |
| `tokenizer.py` | chore: remove usage of load_tiktoken_bpe (#2276) | 2025-06-02 07:33:37 -07:00 |