llama-stack-mirror/llama_stack/providers/inline/safety
slekkala1 25e0553eed
chore: Change moderations api response to Provider returned categories (#3098)
# What does this PR do?
To comply with the Llama model policies, return the safety categories exactly as the provider reports them. As a consequence, the moderations API response loses OpenAI (OAI) compatibility for category names.
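A minimal sketch of the behavior change described above. This is not the actual llama_stack code; the mapping table and function name are hypothetical, illustrating the difference between remapping provider categories onto OpenAI's fixed moderation category names versus passing them through unchanged.

```python
# Hypothetical example -- not the real llama_stack implementation.
# Illustrates: OAI-compat mode renames/drops provider categories,
# while pass-through mode returns them exactly as the provider did.

# Assumed example mapping from Llama Guard category codes to
# OpenAI-style moderation category names (values are illustrative).
OAI_CATEGORY_MAP = {
    "S1": "violence",
    "S10": "hate",
}

def moderation_categories(provider_categories: list[str], openai_compat: bool = False) -> list[str]:
    """Return category names for a moderations response.

    openai_compat=True: rename known categories and drop unknown ones
    (the old behavior this PR removes).
    openai_compat=False: return categories as-is from the provider
    (the new behavior).
    """
    if openai_compat:
        return [OAI_CATEGORY_MAP[c] for c in provider_categories if c in OAI_CATEGORY_MAP]
    return list(provider_categories)
```

With pass-through, a category the mapping table doesn't know about (e.g. a provider-specific code) is preserved instead of silently dropped.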


## Test Plan
```shell
SAFETY_MODEL=llama-guard3:8b LLAMA_STACK_CONFIG=starter uv run pytest \
  -v tests/integration/safety/test_safety.py \
  --text-model=llama3.2:3b-instruct-fp16 \
  --embedding-model=all-MiniLM-L6-v2 --safety-shield=ollama
```
2025-08-13 09:47:35 -07:00
| Path | Last commit | Date |
| --- | --- | --- |
| code_scanner | chore: enable pyupgrade fixes (#1806) | 2025-05-01 14:23:50 -07:00 |
| llama_guard | chore: Change moderations api response to Provider returned categories (#3098) | 2025-08-13 09:47:35 -07:00 |
| prompt_guard | chore: Change moderations api response to Provider returned categories (#3098) | 2025-08-13 09:47:35 -07:00 |
| __init__.py | add missing inits | 2024-11-08 17:54:24 -08:00 |