forked from phoenix/litellm-mirror
Merge branch 'main' into litellm_llamaguard_custom_categories
This commit is contained in:
commit 038ba426ab
57 changed files with 585 additions and 364 deletions
@@ -6,9 +6,4 @@ Code in this folder is licensed under a commercial license. Please review the [L

👉 **Using in an Enterprise / Need specific features?** Meet with us [here](https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat?month=2024-02)

## Enterprise Features:

- Track, View spend per tag https://docs.litellm.ai/docs/proxy/spend
- Custom API / microservice callbacks
- Google Text Moderation API

See all Enterprise Features here 👉 [Docs](https://docs.litellm.ai/docs/proxy/enterprise)
|
|
@@ -110,7 +110,6 @@ class _ENTERPRISE_LlamaGuard(CustomLogger):
            -1
        ]  # get the last response - llama guard has a 4k token limit
        self.set_custom_prompt_template(messages=[safety_check_messages])
        # print(f"self.model: {self.model}")
        response = await litellm.acompletion(
            model=self.model,
            messages=[safety_check_messages],
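The truncation step in the hunk above (`[-1]` on the messages list) keeps only the most recent chat message before the safety check, since LlamaGuard has roughly a 4k token context limit. A minimal sketch of that slicing logic, standalone and without litellm; the helper name `last_message_for_guard` is hypothetical:

```python
def last_message_for_guard(messages: list[dict]) -> dict:
    """Return only the final chat message, mirroring the messages[-1]
    truncation in the diff (LlamaGuard has a ~4k token limit)."""
    if not messages:
        raise ValueError("messages must be non-empty")
    return messages[-1]


conversation = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
    {"role": "user", "content": "Tell me a joke"},
]

# Only this last message would be wrapped in a list and passed on
# to the guard model, as in `messages=[safety_check_messages]` above.
safety_check_message = last_message_for_guard(conversation)
print(safety_check_message["content"])  # -> Tell me a joke
```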
|