llama-stack-mirror/llama_stack/providers
Mustafa Elbehery b5b5f5b9ae
chore: add mypy prompt guard (#2678)
# What does this PR do?
<!-- Provide a short summary of what this PR does and why. Link to
relevant issues if applicable. -->
This PR adds static type coverage to `llama-stack`

Part of https://github.com/meta-llama/llama-stack/issues/2647

<!-- If resolving an issue, uncomment and update the line below -->
<!-- Closes #[issue-number] -->

## Test Plan
<!-- Describe the tests you ran to verify your changes with result
summaries. *Provide clear instructions so the plan can be easily
re-executed.* -->

Signed-off-by: Mustafa Elbehery <melbeher@redhat.com>
2025-08-11 08:40:40 -07:00
..
inline chore: add mypy prompt guard (#2678) 2025-08-11 08:40:40 -07:00
registry feat: Add Google Vertex AI inference provider support (#2841) 2025-08-11 08:22:04 -04:00
remote feat: Add Google Vertex AI inference provider support (#2841) 2025-08-11 08:22:04 -04:00
utils fix: telemetry logger spams when queue is full (#3070) 2025-08-08 13:47:36 -07:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
datatypes.py feat: create unregister shield API endpoint in Llama Stack (#2853) 2025-08-05 07:33:46 -07:00