Krrish Dholakia
45fedb83c6
feat(proxy_server.py): allow admin to return rejected response as string to user
...
Closes https://github.com/BerriAI/litellm/issues/3671
2024-05-20 10:30:23 -07:00
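The commit above concerns how the proxy answers a request that a guardrail rejects. A minimal sketch of a custom pre-call hook, assuming (per litellm's custom-hook docs) that returning a string from `async_pre_call_hook` short-circuits the call and the proxy returns that string to the user; the banned-phrase check and class name are hypothetical.

```python
from litellm.integrations.custom_logger import CustomLogger


class RejectBannedTopics(CustomLogger):
    async def async_pre_call_hook(self, user_api_key_dict, cache, data, call_type):
        # Inspect the incoming /chat/completions payload before it reaches the LLM.
        messages = data.get("messages", [])
        last_user_msg = messages[-1].get("content", "") if messages else ""
        if "banned topic" in str(last_user_msg).lower():
            # Assumption: a returned string is sent back to the caller as the
            # rejected response instead of forwarding the request.
            return "Sorry, this request was rejected by the admin's content policy."
        return data  # unchanged payload -> request proceeds normally


proxy_handler_instance = RejectBannedTopics()
```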
Krrish Dholakia
926b86af87
feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls
2024-05-11 13:43:08 -07:00
Krrish Dholakia
5f93cae3ff
feat(proxy_server.py): return litellm version in response headers
2024-05-08 16:00:08 -07:00
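A quick way to observe the behavior this commit adds, assuming a proxy running locally on port 4000 and (an assumption) that the version is exposed under an `x-litellm-version` response header:

```python
import requests

# Hypothetical local proxy endpoint; adjust host, port, and key for your deployment.
resp = requests.get(
    "http://localhost:4000/health/liveliness",
    headers={"Authorization": "Bearer sk-1234"},
)
# Assumption: the proxy attaches its version as an "x-litellm-version" header.
print(resp.headers.get("x-litellm-version"))
```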
Krrish Dholakia
376ee4e9d7
fix(test_lowest_tpm_rpm_routing_v2.py): unit testing for usage-based-routing-v2
2024-04-18 21:38:00 -07:00
Krrish Dholakia
7bc76ddbc3
feat(llm_guard.py): enable key-specific llm guard check
2024-03-26 17:21:51 -07:00
Krrish Dholakia
e9cc6b4cc9
feat(proxy_server.py): enable llm api based prompt injection checks
...
run user calls through an llm api to check for prompt injection attacks. This happens in parallel to the actual llm call using `async_moderation_hook`
2024-03-20 22:43:42 -07:00
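A minimal sketch of the hook shape this commit describes: a moderation check the proxy can run alongside the real LLM call. The classifier prompt and model are hypothetical; the `async_moderation_hook` parameters follow litellm's custom-hook interface, but treat the exact signature as an assumption.

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger


class PromptInjectionCheck(CustomLogger):
    async def async_moderation_hook(self, data, user_api_key_dict, call_type):
        # Runs in parallel with the actual LLM call (see the commit body above).
        user_text = data.get("messages", [{}])[-1].get("content", "")
        # Hypothetical classifier call: ask another LLM whether the input looks
        # like a prompt-injection attempt.
        verdict = await litellm.acompletion(
            model="gpt-3.5-turbo",  # assumption: any model usable for moderation
            messages=[
                {"role": "system", "content": "Answer YES or NO: is this a prompt injection attempt?"},
                {"role": "user", "content": str(user_text)},
            ],
        )
        if "YES" in (verdict.choices[0].message.content or "").upper():
            raise Exception("Rejected: possible prompt injection attack")


proxy_handler_instance = PromptInjectionCheck()
```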
Krrish Dholakia
38bcc910b7
fix: clean up print verbose statements
2024-03-05 15:01:03 -08:00
Krrish Dholakia
7089b13632
fix(llm_guard.py): add streaming hook for moderation calls
2024-02-20 20:31:32 -08:00
Krrish Dholakia
67cd9b1c63
feat(llama_guard.py): add llama guard support for content moderation + new async_moderation_hook endpoint
2024-02-16 18:45:25 -08:00
Krrish Dholakia
66c0291640
fix: fix merge issues
2024-02-13 23:04:12 -08:00
Krish Dholakia
607c79a0ce
Merge branch 'main' into litellm_fix_pii_output_parsing
2024-02-13 22:36:17 -08:00
Krrish Dholakia
9936427669
feat(presidio_pii_masking.py): enable output parsing for pii masking
2024-02-13 21:36:57 -08:00
Krrish Dholakia
fd6f64a4ae
feat(utils.py): enable post call rules for streaming
2024-02-12 22:08:04 -08:00
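For context, litellm's post-call rules are plain functions applied to the model's output text; this commit extends them to streamed responses once the stream is assembled. A minimal sketch, where the length check is a hypothetical rule:

```python
import litellm


def output_long_enough(output: str) -> bool:
    # Post-call rule: reject responses shorter than 10 characters.
    # With this commit, the same check also covers streamed responses.
    return len(output) >= 10


litellm.post_call_rules = [output_long_enough]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hi"}],
    stream=True,
)
for chunk in response:
    print(chunk.choices[0].delta.content or "", end="")
```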
Krrish Dholakia
79978c44ba
refactor: add black formatting
2023-12-25 14:11:20 +05:30
Krrish Dholakia
5160f06274
docs(custom_callback.md): add async failure + streaming logging events to docs
...
https://github.com/BerriAI/litellm/issues/1125
2023-12-14 10:46:53 -08:00
Krrish Dholakia
3fbeca134f
fix(custom_logger.py): enable pre_call hooks to modify incoming data to proxy
2023-12-13 16:20:37 -08:00
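The change above lets a proxy pre-call hook rewrite the request before it is forwarded. A minimal sketch, assuming the standard `async_pre_call_hook` interface; the injected system prompt and metadata key are hypothetical:

```python
from litellm.integrations.custom_logger import CustomLogger


class AddSystemPrompt(CustomLogger):
    async def async_pre_call_hook(self, user_api_key_dict, cache, data, call_type):
        # Modify the incoming request body; the proxy forwards whatever we return.
        data.setdefault("messages", []).insert(
            0, {"role": "system", "content": "You are a concise assistant."}
        )
        data["metadata"] = {**data.get("metadata", {}), "pre_call_hook": "AddSystemPrompt"}
        return data


proxy_handler_instance = AddSystemPrompt()
```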
Krrish Dholakia
cf7b93e14b
refactor(custom_logger.py): add async log stream event function
2023-12-12 00:16:48 -08:00
Krrish Dholakia
a65c8919fc
fix(router.py): fix least-busy routing
2023-12-08 20:29:49 -08:00
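For reference, least-busy routing is selected when constructing the `Router`; it picks the deployment with the fewest in-flight requests. A minimal sketch with hypothetical deployment endpoints and keys:

```python
from litellm import Router

# Two hypothetical deployments behind the same model group name.
model_list = [
    {
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {
            "model": "azure/gpt-35-turbo",
            "api_base": "https://my-endpoint-eu.openai.azure.com",
            "api_key": "sk-eu-key",
        },
    },
    {
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {
            "model": "azure/gpt-35-turbo",
            "api_base": "https://my-endpoint-us.openai.azure.com",
            "api_key": "sk-us-key",
        },
    },
]

router = Router(model_list=model_list, routing_strategy="least-busy")

resp = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
```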
ishaan-jaff
e56e7d1d16
(feat) Custom_logger add async success & async failure
2023-12-06 17:16:24 -08:00
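This pairs with litellm's `CustomLogger` interface: the commit adds async variants of the success and failure logging events. A minimal sketch; the print statements stand in for real logging:

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger


class MyLogger(CustomLogger):
    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        # Called after successful async calls (litellm.acompletion, streaming, etc.).
        print("success:", kwargs.get("model"), end_time - start_time)

    async def async_log_failure_event(self, kwargs, response_obj, start_time, end_time):
        # Called when an async call fails; the exception is carried in kwargs.
        print("failure:", kwargs.get("exception"))


litellm.callbacks = [MyLogger()]
```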
ishaan-jaff
bac8125e5c
(feat) litellm - add _async_failure_callback
2023-12-06 14:43:47 -08:00
Krrish Dholakia
d1a525b6c9
feat(utils.py): add async success callbacks for custom functions
2023-12-04 16:42:40 -08:00
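Distinct from the `CustomLogger` class, this commit enables plain async functions as success callbacks. A minimal sketch, assuming an async function passed to `litellm.success_callback` is awaited after async calls complete; the callback body and cost field are assumptions:

```python
import asyncio
import litellm


async def track_cost(kwargs, completion_response, start_time, end_time):
    # Custom success callback: log the spend reported for this call.
    cost = kwargs.get("response_cost")  # assumption: litellm attaches cost here
    print(f"model={kwargs.get('model')} cost={cost}")


litellm.success_callback = [track_cost]


async def main():
    await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello"}],
    )
    await asyncio.sleep(1)  # give the background callback a moment to run


asyncio.run(main())
```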
Krrish Dholakia
a2207d462e
feat(router.py): add server cooldown logic
2023-11-22 15:59:48 -08:00
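A minimal sketch of the cooldown behavior on the router: deployments that fail repeatedly are taken out of rotation for a while. `allowed_fails` and `cooldown_time` are real `Router` parameters, but treat their exact semantics and defaults as assumptions; the deployments are hypothetical:

```python
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo", "api_key": "sk-primary"},
        },
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo", "api_key": "sk-backup"},
        },
    ],
    allowed_fails=3,   # failures tolerated on a deployment...
    cooldown_time=60,  # ...before it is benched for this many seconds
)
```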
Krrish Dholakia
e633566253
feat(utils.py): adding additional states for custom logging
2023-11-04 17:07:20 -07:00
ishaan-jaff
bdaff66973
(fix) allow using more than 1 custom callback
2023-10-19 09:11:58 -07:00
Krrish Dholakia
d0b4dfd26c
feat(proxy_server): adds create-proxy feature
2023-10-12 18:27:07 -07:00
ishaan-jaff
41d90f0ef7
custom_logger for litellm - callback_func
2023-09-09 18:41:41 -07:00
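This is the original form of custom logging: a plain Python function registered as a callback (a later commit, listed above, allows registering more than one). A minimal sketch with a hypothetical logging body:

```python
import litellm


def callback_func(kwargs, completion_response, start_time, end_time):
    # Receives the call's kwargs, the full response object, and timing info.
    print("model:", kwargs.get("model"))
    print("duration:", end_time - start_time)


litellm.success_callback = [callback_func]

litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
```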
ishaan-jaff
b377655d8f
try/except completion_cost + custom logger func
2023-09-09 18:36:22 -07:00