# What does this PR do?

## Test Plan

```
~/projects/lst3 remotes/origin/HEAD* .venv
❯ curl http://localhost:8321/v1/moderations \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "input": [ "hello" ]
  }'
{"detail":"Invalid value: No shield associated with provider_resource id gpt-4o-mini: choose from ['together/meta-llama/Llama-Guard-4-12B']"}
```
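For reference, a follow-up request using the shield ID reported in the error message above. This is a sketch, assuming the `together/meta-llama/Llama-Guard-4-12B` shield is registered in the running distribution; the exact response body will depend on the configured safety provider:

```
# Hypothetical follow-up: pass the registered shield ID from the error
# above as the moderation model (assumes that shield is available).
curl http://localhost:8321/v1/moderations \
  -H "Content-Type: application/json" \
  -d '{
    "model": "together/meta-llama/Llama-Guard-4-12B",
    "input": [ "hello" ]
  }'
```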
- __init__.py
- datasets.py
- eval_scoring.py
- inference.py
- safety.py
- tool_runtime.py
- vector_io.py