Fix for safety

Ashwin Bharambe 2024-09-17 19:56:58 -07:00
parent 9487ad8294
commit 25adc83de8
2 changed files with 2 additions and 2 deletions


@@ -500,7 +500,7 @@ You know what's even more hilarious? People like you who think they can just Goo
 Similarly you can test safety (if you configured llama-guard and/or prompt-guard shields) by:
 ```
-python -m llama_stack.safety.client localhost 5000
+python -m llama_stack.apis.safety.client localhost 5000
 ```
 You can find more example scripts with client SDKs to talk with the Llama Stack server in our [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps/tree/main/sdk_examples) repo.
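
As a quick reference, the updated client module can also be driven from a small script. This is only an illustrative sketch: the subprocess wrapper and the assumption that a Llama Stack server with llama-guard and/or prompt-guard shields is already listening on localhost:5000 are additions here, not part of this commit; the command itself matches the line added above.

```
import subprocess

# Run the safety test client against a locally running Llama Stack server,
# using the module path introduced by this commit.
subprocess.run(
    ["python", "-m", "llama_stack.apis.safety.client", "localhost", "5000"],
    check=True,
)
```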


@@ -8,7 +8,7 @@ from typing import List
 from llama_models.llama3.api.datatypes import Message
-from llama_stack.safety.meta_reference.shields.base import (
+from llama_stack.providers.impls.meta_reference.safety.shields.base import (
     OnViolationAction,
     ShieldBase,
     ShieldResponse,
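
The corrected import path points at the shield base classes that now live under the meta-reference safety provider. As a rough sketch of how those names fit together, the toy shield below subclasses ShieldBase and returns a ShieldResponse. The async run(messages) signature, the on_violation_action constructor argument, and the ShieldResponse fields (is_violation, violation_return_message) are assumptions about the base classes at this commit, not verified APIs.

```
from typing import List

from llama_models.llama3.api.datatypes import Message
from llama_stack.providers.impls.meta_reference.safety.shields.base import (
    OnViolationAction,
    ShieldBase,
    ShieldResponse,
)


class BlocklistShield(ShieldBase):
    """Toy shield that flags messages containing blocklisted terms.

    Assumes ShieldBase accepts an on_violation_action argument and that
    subclasses implement an async run(messages) -> ShieldResponse method.
    """

    def __init__(self, blocklist: List[str]):
        super().__init__(on_violation_action=OnViolationAction.RAISE)
        self.blocklist = [term.lower() for term in blocklist]

    async def run(self, messages: List[Message]) -> ShieldResponse:
        for message in messages:
            text = message.content if isinstance(message.content, str) else ""
            if any(term in text.lower() for term in self.blocklist):
                # Field names here are assumptions about ShieldResponse.
                return ShieldResponse(
                    is_violation=True,
                    violation_return_message="Message contains a blocklisted term.",
                )
        return ShieldResponse(is_violation=False)
```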