Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-10-24 00:47:00 +00:00
# What does this PR do?

This PR adds static type coverage to `llama-stack`.

Part of https://github.com/meta-llama/llama-stack/issues/2647

## Test Plan

Signed-off-by: Mustafa Elbehery <melbeher@redhat.com>
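To illustrate the kind of change a static-typing PR makes (this is a hypothetical sketch, not the actual llama-stack code or its function names), an untyped helper gains explicit parameter and return annotations so a checker such as mypy can verify call sites:

```python
from typing import Optional


def format_prompt(system: Optional[str], user: str) -> str:
    """Join an optional system prompt with the user message.

    The annotations let mypy reject calls like format_prompt(42, "hi")
    at check time instead of failing at runtime.
    """
    if system is None:
        return user
    return f"{system}\n\n{user}"
```

A typical way to verify such coverage locally is to run mypy over the changed modules (e.g. `mypy llama_stack/`), though the exact command depends on the project's mypy configuration.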