llama-stack-mirror/llama_stack/models
Mustafa Elbehery 2d3d9664a7
chore(api): add mypy coverage to prompts (#2657)
# What does this PR do?
This PR adds static type (mypy) coverage to the prompt-handling code in `llama-stack`.

Part of https://github.com/meta-llama/llama-stack/issues/2647
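
For context, bringing a module under mypy coverage typically means annotating its functions so every call site can be checked. The sketch below is a hypothetical illustration of that kind of change (the function names and signatures are illustrative, not taken from this PR):

```python
# Hypothetical sketch: the sort of annotations added when a prompt module
# is brought under mypy coverage. Names here are illustrative only.
from typing import Any


def render_prompt(template: str, values: dict[str, Any]) -> str:
    """Fill a prompt template; explicit parameter and return types let mypy
    check every caller once the module is included in type checking."""
    return template.format(**values)


def truncate_prompt(prompt: str, max_chars: int | None = None) -> str:
    """Optional values are spelled `X | None` so mypy flags unguarded use."""
    if max_chars is None:
        return prompt
    return prompt[:max_chars]
```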


## Test Plan

Signed-off-by: Mustafa Elbehery <melbeher@redhat.com>
2025-07-09 10:07:00 +02:00
| Name | Last commit | Last updated |
|------|-------------|--------------|
| llama | chore(api): add mypy coverage to prompts (#2657) | 2025-07-09 10:07:00 +02:00 |
| __init__.py | feat: introduce llama4 support (#1877) | 2025-04-05 11:53:35 -07:00 |