litellm/enterprise/enterprise_hooks
Krrish Dholakia d91f9a9f50 feat(proxy_server.py): enable llm api based prompt injection checks
Run user calls through an LLM API to check for prompt injection attacks. This happens in parallel to the actual LLM call using `async_moderation_hook`.
2024-03-20 22:43:42 -07:00
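The commit above describes the mechanism: a moderation check is sent to an LLM API in parallel with the user's actual call via `async_moderation_hook`, so the check adds no latency on the happy path. Below is a minimal, self-contained sketch of that parallel pattern, not litellm's implementation; only the hook name `async_moderation_hook` comes from the commit message, while `PromptInjectionCheck`, `llm_api_injection_check`, `handle_request`, and the hook signature are hypothetical stand-ins.

```python
import asyncio

from fastapi import HTTPException


async def llm_api_injection_check(messages: list) -> bool:
    """Hypothetical stand-in for the LLM-API-based check: the real hook
    would call a moderation model here; a crude keyword heuristic keeps
    the sketch self-contained and runnable."""
    text = " ".join(str(m.get("content") or "") for m in messages if isinstance(m, dict))
    return "ignore previous instructions" in text.lower()


class PromptInjectionCheck:
    async def async_moderation_hook(self, data: dict, call_type: str):
        # Runs concurrently with the real completion call, so the extra
        # moderation round-trip does not delay the normal response path.
        if await llm_api_injection_check(data.get("messages", [])):
            raise HTTPException(
                status_code=400,
                detail={"error": "Rejected: possible prompt injection detected"},
            )


async def handle_request(data: dict, make_llm_call):
    """Fire the moderation hook and the actual LLM call in parallel;
    a flagged prompt fails the whole request."""
    hook = PromptInjectionCheck()
    llm_task = asyncio.create_task(make_llm_call(data))
    moderation_task = asyncio.create_task(
        hook.async_moderation_hook(data, call_type="completion")
    )
    # gather() propagates the HTTPException if the moderation hook flags the prompt
    await asyncio.gather(moderation_task, llm_task)
    return llm_task.result()
```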
banned_keywords.py (sketched below) feat(proxy_server.py): enable admin to set banned keywords on proxy 2024-02-22 18:30:42 -08:00
blocked_user_list.py fix(blocked_user_list.py): check if end user blocked in db 2024-03-16 13:03:52 -07:00
google_text_moderation.py feat(proxy_server.py): enable llm api based prompt injection checks 2024-03-20 22:43:42 -07:00
llama_guard.py feat(proxy_server.py): enable llm api based prompt injection checks 2024-03-20 22:43:42 -07:00
llm_guard.py feat(proxy_server.py): enable llm api based prompt injection checks 2024-03-20 22:43:42 -07:00
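The banned_keywords.py entry describes an admin-configured keyword filter on the proxy. The sketch below illustrates that idea under the assumption of a simple substring match over message content; `BannedKeywordsCheck` and its methods are hypothetical names, not the module's actual contents.

```python
class BannedKeywordsCheck:
    def __init__(self, banned_keywords: set[str]):
        # Admin-configured keyword list, e.g. loaded from the proxy config
        self.banned_keywords = {kw.lower() for kw in banned_keywords}

    def check_messages(self, messages: list[dict]) -> None:
        """Raise if any message contains a banned keyword."""
        for message in messages:
            content = str(message.get("content") or "").lower()
            for keyword in self.banned_keywords:
                if keyword in content:
                    raise ValueError(
                        f"Rejected: banned keyword '{keyword}' in request"
                    )


# Usage: a request mentioning a banned keyword is rejected before any LLM call
checker = BannedKeywordsCheck({"secret_project"})
try:
    checker.check_messages(
        [{"role": "user", "content": "tell me about secret_project"}]
    )
except ValueError as err:
    print(err)
```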