llama-stack-mirror/llama_stack/providers
slekkala1 7519ab4024
feat: Code scanner Provider impl for moderations api (#3100)
# What does this PR do?
Add a CodeScanner provider implementation for the moderations API.

## Test Plan
`SAFETY_MODEL=CodeScanner LLAMA_STACK_CONFIG=starter uv run pytest -v
tests/integration/safety/test_safety.py
--text-model=llama3.2:3b-instruct-fp16
--embedding-model=all-MiniLM-L6-v2 --safety-shield=ollama`
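
For context, here is a minimal client-side sketch of how the new moderations route could be exercised once a code-scanner shield is registered. It is not taken from this PR: the endpoint path, port, and model id are assumptions (the `CodeScanner` id mirrors the `SAFETY_MODEL` value in the Test Plan above) and may differ in your distribution.

```python
# Hedged sketch: call an OpenAI-compatible moderations endpoint on a running
# Llama Stack server. URL path, port, and model id are assumptions.
import requests

BASE_URL = "http://localhost:8321"  # assumed default Llama Stack port

resp = requests.post(
    f"{BASE_URL}/v1/openai/v1/moderations",  # assumed OpenAI-compatible moderations route
    json={
        "model": "CodeScanner",  # shield/model id taken from SAFETY_MODEL in the Test Plan
        "input": "eval(input())  # potentially unsafe code snippet to scan",
    },
    timeout=30,
)
resp.raise_for_status()
result = resp.json()

# OpenAI-compatible moderation responses return a list of results, each with a
# `flagged` boolean and per-category details for the corresponding input item.
for item in result.get("results", []):
    print("flagged:", item.get("flagged"), "categories:", item.get("categories"))
```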

This PR needs to land after https://github.com/meta-llama/llama-stack/pull/3098.
2025-08-18 14:15:40 -07:00
| Name | Last commit | Date |
|------|-------------|------|
| inline | feat: Code scanner Provider impl for moderations api (#3100) | 2025-08-18 14:15:40 -07:00 |
| registry | feat: add batches API with OpenAI compatibility (with inference replay) (#3162) | 2025-08-15 15:34:15 -07:00 |
| remote | fix: Dell distribution missing kvstore (#3113) | 2025-08-13 06:18:25 -07:00 |
| utils | fix(misc): pin openai dependency to < 1.100.0 (#3192) | 2025-08-18 12:20:50 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | feat: create unregister shield API endpoint in Llama Stack (#2853) | 2025-08-05 07:33:46 -07:00 |