Krrish Dholakia
d57995a73d
build(schema.prisma): use jsonProtocol to fix db connection issues
...
https://github.com/prisma/prisma/discussions/19978
2024-03-21 18:01:45 -07:00
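The jsonProtocol change above maps to a one-line edit in `schema.prisma`. A hedged sketch, assuming the generator block uses `prisma-client-py` (litellm's proxy is Python) and that jsonProtocol was still a preview feature at this Prisma version, per the linked discussion:

```prisma
// Sketch of the schema.prisma change this commit describes: enable the
// jsonProtocol preview feature to work around DB connection issues.
// The provider name is an assumption based on the Python proxy.
generator client {
  provider        = "prisma-client-py"
  previewFeatures = ["jsonProtocol"]
}
```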
Krrish Dholakia
bc17404055
bump: version 1.33.2 → 1.33.3
2024-03-21 17:56:26 -07:00
Krrish Dholakia
4ac14a4e85
build(networking.tsx): fix trailing slash
2024-03-21 17:50:59 -07:00
Krrish Dholakia
d9577c3e2b
build(networking.tsx): modify url to prevent redirects
2024-03-21 17:43:41 -07:00
Krrish Dholakia
abe8d7c921
docs(configs.md): add disable swagger ui env tutorial to docs
2024-03-21 17:16:52 -07:00
Krrish Dholakia
33964233a5
fix(proxy_server.py): allow user to disable swagger ui docs via env
...
user can disable swagger ui docs by setting 'NO_DOCS="True"' in their env
2024-03-21 17:15:18 -07:00
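The env-flag toggle described in this commit can be sketched as follows. This is a minimal illustration, not litellm's actual `proxy_server.py` code: the `NO_DOCS` variable name comes from the commit body, while the helper name and the exact string comparison are assumptions (FastAPI apps typically disable Swagger UI by passing `docs_url=None` when such a flag is set):

```python
import os

def swagger_docs_disabled() -> bool:
    # Per the commit body: setting NO_DOCS="True" in the environment
    # disables the Swagger UI docs. In FastAPI this would typically be
    # wired up as FastAPI(docs_url=None) when this returns True.
    return os.getenv("NO_DOCS", "").lower() == "true"

os.environ["NO_DOCS"] = "True"
print(swagger_docs_disabled())  # → True
```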
Krrish Dholakia
44a91fe43a
bump: version 1.33.1 → 1.33.2
2024-03-21 16:55:52 -07:00
Krrish Dholakia
b5457beba6
fix(llm_guard.py): await moderation check
2024-03-21 16:55:28 -07:00
Krrish Dholakia
d7b502bf64
bump: version 1.33.0 → 1.33.1
2024-03-21 11:23:02 -07:00
Krrish Dholakia
c4dad3f34f
fix(llm_guard.py): more logging for llm guard.py
2024-03-21 11:22:52 -07:00
Krrish Dholakia
550c9508d3
bump: version 1.32.9 → 1.33.0
2024-03-21 10:56:58 -07:00
Krrish Dholakia
af27a61d76
refactor(main.py): trigger new build
2024-03-21 10:56:44 -07:00
Krish Dholakia
4df79ff4db
Merge pull request #2614 from BerriAI/litellm_llm_api_prompt_injection_check
...
feat(proxy_server.py): enable llm api based prompt injection checks
2024-03-21 09:57:16 -07:00
Krish Dholakia
33a433eb0a
Merge branch 'main' into litellm_llm_api_prompt_injection_check
2024-03-21 09:57:10 -07:00
Krrish Dholakia
0521e8a1d9
fix(prompt_injection_detection.py): fix type check
2024-03-21 08:56:13 -07:00
Krrish Dholakia
42d62cf99b
test(test_llm_guard.py): fix llm guard integration
2024-03-21 08:31:11 -07:00
Krrish Dholakia
84a540f2d6
build: fix mypy build issues
2024-03-21 08:27:23 -07:00
Krrish Dholakia
8e8c4e214e
fix: fix linting issue
2024-03-21 08:19:09 -07:00
Krrish Dholakia
e904a84bdb
build: reintegrate mypy linting in pre-commit hook
2024-03-21 08:09:02 -07:00
Krrish Dholakia
2ce5de903f
fix: fix linting issue
2024-03-21 08:05:47 -07:00
Ishaan Jaff
bcd62034ed
Merge pull request #2563 from eltociear/patch-2
...
Update proxy_server.py
2024-03-21 07:29:33 -07:00
Krrish Dholakia
d91f9a9f50
feat(proxy_server.py): enable llm api based prompt injection checks
...
run user calls through an llm api to check for prompt injection attacks. This happens in parallel to the actual llm call using `async_moderation_hook`
2024-03-20 22:43:42 -07:00
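The parallel-check pattern this commit describes can be sketched with `asyncio.gather`: the moderation check and the actual completion run concurrently, and the response is discarded if the check flags the prompt. The hook name `async_moderation_hook` is from the commit body; everything else (the stand-in calls, the phrase check, `guarded_call`) is illustrative, not litellm's internals:

```python
import asyncio

async def async_moderation_hook(prompt: str) -> bool:
    # Stand-in for the LLM-API-based prompt injection check.
    await asyncio.sleep(0)
    return "ignore previous instructions" in prompt.lower()

async def llm_call(prompt: str) -> str:
    # Stand-in for the actual completion call.
    await asyncio.sleep(0)
    return f"response to: {prompt}"

async def guarded_call(prompt: str) -> str:
    # Run the moderation check in parallel with the real call,
    # as the commit describes.
    flagged, response = await asyncio.gather(
        async_moderation_hook(prompt), llm_call(prompt)
    )
    if flagged:
        raise ValueError("prompt injection detected")
    return response

print(asyncio.run(guarded_call("hello")))  # → response to: hello
```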
Ishaan Jaff
8363bd4e7e
Merge pull request #2613 from BerriAI/litellm_fix_quick_start_docker
...
(docs) Litellm fix quick start docker
2024-03-20 21:34:19 -07:00
Ishaan Jaff
354d13c7e1
(docs) fix litellm docker quick start
2024-03-20 21:34:01 -07:00
Ishaan Jaff
3612e47d6b
(fix) clean up quick start
2024-03-20 21:24:42 -07:00
Ishaan Jaff
28db804917
Merge pull request #2612 from BerriAI/litellm_update_vertex_ai_docs
...
(docs) add example using vertex ai on litellm proxy
2024-03-20 21:20:09 -07:00
Ishaan Jaff
ee1a0a9409
(docs) add example using openai compatib model
2024-03-20 21:15:50 -07:00
Ishaan Jaff
f7945a945b
(docs) add example using vertex ai on litellm proxy
2024-03-20 21:07:55 -07:00
Krish Dholakia
007d439017
Merge pull request #2606 from BerriAI/litellm_jwt_auth_updates
...
fix(handle_jwt.py): track spend for user using jwt auth
2024-03-20 19:40:17 -07:00
Krish Dholakia
cb23f2efa4
Merge pull request #2611 from BerriAI/litellm_prompt_injection_heuristics_fix
...
Ensure prompt injection attack 'known phrases' are >= 3 words
2024-03-20 19:39:14 -07:00
Krrish Dholakia
f24d3ffdb6
fix(proxy_server.py): fix import
2024-03-20 19:15:06 -07:00
Krrish Dholakia
3bb0e24cb7
fix(prompt_injection_detection.py): ensure combinations are actual phrases, not just 1-2 words
...
reduces misflagging
https://github.com/BerriAI/litellm/issues/2601
2024-03-20 19:09:38 -07:00
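The heuristic fix above (only treat word combinations as "known phrases" when they are at least 3 words long, to reduce misflagging) can be sketched like this. The function name, the word-combination approach, and the example attack string are all illustrative, not the actual `prompt_injection_detection.py` logic:

```python
from itertools import combinations

def known_phrases(attack: str, min_words: int = 3) -> set:
    # Generate candidate phrases from an attack string, keeping only
    # combinations of at least `min_words` words — 1-2 word fragments
    # are too generic and cause false positives.
    words = attack.split()
    phrases = set()
    for r in range(min_words, len(words) + 1):
        for combo in combinations(words, r):
            phrases.add(" ".join(combo))
    return phrases

phrases = known_phrases("ignore previous instructions")
print(phrases)  # → {'ignore previous instructions'}
```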
Krrish Dholakia
8bb00c4ae8
fix(caching.py): enable async setting of cache for dual cache
2024-03-20 18:42:34 -07:00
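The dual-cache change above (an in-memory layer backed by Redis, where a set must be awaitable so the Redis write can be awaited) can be sketched as follows. The Redis layer is faked with a dict here, and the class and method names are illustrative rather than litellm's actual `caching.py` API:

```python
import asyncio

class DualCache:
    def __init__(self):
        self.in_memory = {}
        self.redis = {}  # stand-in for a real async Redis client

    async def async_set_cache(self, key, value):
        # Write to both layers; the Redis write is awaited, which is
        # what "async setting of cache" enables.
        self.in_memory[key] = value
        await asyncio.sleep(0)  # stand-in for an awaited Redis SET
        self.redis[key] = value

    async def async_get_cache(self, key):
        # Prefer the fast in-memory layer, fall back to Redis.
        if key in self.in_memory:
            return self.in_memory[key]
        return self.redis.get(key)

cache = DualCache()
asyncio.run(cache.async_set_cache("k", "v"))
print(asyncio.run(cache.async_get_cache("k")))  # → v
```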
Ishaan Jaff
285084e4be
bump: version 1.32.8 → 1.32.9
2024-03-20 17:05:00 -07:00
Ishaan Jaff
7654400a10
Merge pull request #2609 from BerriAI/litellm_docs_using_cache_ping
...
[docs] using /cache/ping
2024-03-20 17:04:38 -07:00
Ishaan Jaff
5842486810
(docs) using /cache/ping
2024-03-20 17:03:26 -07:00
Ishaan Jaff
1d63e05e6c
bump: version 1.32.7 → 1.32.8
2024-03-20 16:54:23 -07:00
Ishaan Jaff
8aba161821
Merge pull request #2605 from BerriAI/litellm_fix_num_workers_issue
...
(fix) start proxy with default num_workers=1
2024-03-20 16:44:06 -07:00
Ishaan Jaff
c068a5fb74
(docs) deploying litellm
2024-03-20 11:47:55 -07:00
Krrish Dholakia
90e17b5422
fix(handle_jwt.py): track spend for user using jwt auth
2024-03-20 10:55:52 -07:00
Ishaan Jaff
3ad6e5ffc1
(feat) start proxy with default num_workers=1
2024-03-20 10:46:32 -07:00
Ishaan Jaff
cace0bd6fb
(fix) self.redis_version issue
2024-03-20 10:36:08 -07:00
Ishaan Jaff
7b29273b14
(fix) redis 6.2 version incompatibility issue
2024-03-20 09:38:21 -07:00
Krrish Dholakia
ca970a90c4
fix(handle_jwt.py): remove issuer check
2024-03-20 08:35:23 -07:00
Ishaan Jaff
909883ee04
Merge pull request #2602 from BerriAI/litellm_cache_ping
...
(feat) litellm proxy /cache/ping
2024-03-20 08:32:14 -07:00
Ishaan Jaff
4ed551dc52
(feat) better debugging for /cache/ping
2024-03-20 08:30:11 -07:00
Ishaan Jaff
2256ece5a9
(feat) litellm cache ping
2024-03-20 08:24:13 -07:00
Krrish Dholakia
88733fda5d
docs(prompt_injection.md): open sourcing prompt injection detection
2024-03-19 22:48:52 -07:00
Krrish Dholakia
524c244dd9
fix(utils.py): support response_format param for ollama
...
https://github.com/BerriAI/litellm/issues/2580
2024-03-19 21:07:20 -07:00
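Supporting `response_format` for ollama amounts to translating the OpenAI-style parameter into Ollama's `format` option (Ollama's public API accepts `format: "json"` for JSON mode). A hedged sketch of that mapping; the function name and surrounding logic are illustrative, not litellm's actual `utils.py` code:

```python
def map_openai_params_to_ollama(params: dict) -> dict:
    # Translate OpenAI-style params to Ollama's equivalents, passing
    # everything else through unchanged.
    ollama_params = {}
    for key, value in params.items():
        if key == "response_format" and value.get("type") == "json_object":
            ollama_params["format"] = "json"  # Ollama's JSON mode
        else:
            ollama_params[key] = value
    return ollama_params

print(map_openai_params_to_ollama({"response_format": {"type": "json_object"}}))
# → {'format': 'json'}
```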
Krrish Dholakia
d6624bf6c3
refactor(main.py): trigger new build
2024-03-19 21:05:53 -07:00