Author | Commit | Message | Date
Krish Dholakia | 934a9ac2b4 | Merge pull request #2722 from BerriAI/litellm_db_perf_improvement; feat(proxy/utils.py): enable updating db in a separate server | 2024-03-28 14:56:14 -07:00
Krrish Dholakia | 47ca223d0b | fix(lowest_tpm_rpm_routing.py): fix base case where max tpm/rpm is 0 | 2024-03-28 14:51:31 -07:00
Krrish Dholakia | e8d80509b1 | test(test_update_spend.py): allow db_client to be none | 2024-03-28 13:44:40 -07:00
Ishaan Jaff | 365497e860 | (fix) OpenAI img gen endpoints unstable | 2024-03-28 12:42:04 -07:00
Ishaan Jaff | 6d408dcce7 | (fix) test aimg gen on router | 2024-03-28 12:27:26 -07:00
Krrish Dholakia | 9ef7afd2b4 | test(test_completion.py): skip unresponsive endpoint | 2024-03-27 20:12:22 -07:00
Krrish Dholakia | 9b7383ac67 | fix(utils.py): don't run post-call rules on a coroutine function | 2024-03-27 13:16:27 -07:00
Krish Dholakia | c1f8d346b8 | Merge pull request #2706 from BerriAI/litellm_key_llm_guardrails; feat(llm_guard.py): enable key-specific llm guard check | 2024-03-26 19:02:11 -07:00
Krish Dholakia | e266142d2b | Merge pull request #2705 from BerriAI/litellm_permissions_table; enable new `/team/disable` endpoint | 2024-03-26 18:47:34 -07:00
Krrish Dholakia | 4488480188 | test(test_llm_guard.py): fix test | 2024-03-26 18:37:27 -07:00
Krrish Dholakia | f62f642393 | test(test_llm_guard.py): fix test | 2024-03-26 18:13:15 -07:00
Krrish Dholakia | 5b66cb3864 | test(test_exceptions.py): handle api instability | 2024-03-26 18:06:49 -07:00
Krrish Dholakia | 1046a63521 | test(test_llm_guard.py): unit testing for key-level llm guard enabling | 2024-03-26 17:55:53 -07:00
Krish Dholakia | 0ab708e6f1 | Merge pull request #2704 from BerriAI/litellm_jwt_auth_improvements_3; fix(handle_jwt.py): enable team-based jwt-auth access | 2024-03-26 16:06:56 -07:00
Krrish Dholakia | ea8f6672c5 | test(test_jwt.py): fix test | 2024-03-26 15:22:05 -07:00
Krrish Dholakia | 752516df1b | fix(handle_jwt.py): support public key caching ttl param | 2024-03-26 14:32:55 -07:00
Krrish Dholakia | 4d7f4550e2 | test(test_batch_completions.py): handle anthropic overloaded error | 2024-03-26 13:55:03 -07:00
Krrish Dholakia | 6b1d2551d1 | test(test_batch_completions.py): handle overloaded anthropic error | 2024-03-26 13:53:18 -07:00
Krrish Dholakia | 3a82ff2ef2 | fix(utils.py): don't send subsequent chunks if last chunk sent; prevents multiple empty finish chunks from being sent | 2024-03-26 13:49:42 -07:00
Krrish Dholakia | b4d0a95cff | test(test_router_debug_logs.py): add info statement to log test | 2024-03-26 09:54:26 -07:00
Ishaan Jaff | 2ecdd92619 | Merge pull request #2702 from BerriAI/litellm_cache_flush; [Feat] Proxy - /cache/flushall - delete all elements from cache | 2024-03-26 09:34:39 -07:00
Krrish Dholakia | 2dd2b8a8e3 | test(test_streaming.py): add unit testing for custom stream wrapper | 2024-03-26 08:57:44 -07:00
Krrish Dholakia | 49e8cdbff9 | fix(router.py): check for context window error when handling 400 status code errors; was causing proxy context window fallbacks to not work as expected | 2024-03-26 08:08:15 -07:00
Ishaan Jaff | 787c9b7df0 | (test) claude-1 api is unstable | 2024-03-26 08:07:16 -07:00
Krish Dholakia | eb859f5dba | Merge pull request #2692 from BerriAI/litellm_streaming_fixes; fix(utils.py): ensure last chunk is always empty delta w/ finish reason | 2024-03-25 21:57:04 -07:00
Krrish Dholakia | 643fd6ac96 | test(test_caching.py): fix test_redis_cache_acompletion_stream | 2024-03-25 21:36:47 -07:00
Krrish Dholakia | 4d85387b5a | test(test_azure_astreaming_and_function_calling): fix test to handle caching | 2024-03-25 19:33:57 -07:00
Krrish Dholakia | fa297b67ca | fix(test_amazing_vertex_completion.py): fix test to check if content is none | 2024-03-25 19:11:39 -07:00
Krrish Dholakia | bd75498913 | fix(utils.py): log success event for streaming | 2024-03-25 19:03:10 -07:00
Ishaan Jaff | 07fe08d8b5 | (test) no cache hit | 2024-03-25 18:56:36 -07:00
Krrish Dholakia | a5776a3054 | test(test_custom_logger.py): cleanup test | 2024-03-25 18:32:12 -07:00
Krrish Dholakia | 1ac641165b | fix(utils.py): persist response id across chunks | 2024-03-25 18:20:43 -07:00
Ishaan Jaff | 3fcab0137a | (test) batch writing to cache | 2024-03-25 18:04:04 -07:00
Krrish Dholakia | dc2c4af631 | fix(utils.py): fix text completion streaming | 2024-03-25 16:47:17 -07:00
Krrish Dholakia | 9e1e97528d | fix(utils.py): ensure last chunk is always empty delta w/ finish reason; makes sure we're openai-compatible with our streaming, and adds stricter tests for this as well | 2024-03-25 16:33:41 -07:00
Krrish Dholakia | 591a0a376e | fix(caching.py): support default ttl for caching | 2024-03-25 13:40:17 -07:00
Krrish Dholakia | 2e4e97a48f | test(test_jwt.py): add unit tests for jwt auth integration | 2024-03-25 13:24:39 -07:00
Krrish Dholakia | 93959ab5aa | fix(handle_jwt.py): allow setting proxy admin role string for jwt auth | 2024-03-25 12:20:14 -07:00
Krrish Dholakia | f98aead602 | feat(main.py): support router.chat.completions.create; allows using router with instructor (https://github.com/BerriAI/litellm/issues/2673) | 2024-03-25 08:26:28 -07:00
Krrish Dholakia | e8e7964025 | docs(routing.md): add pre-call checks to docs | 2024-03-23 19:10:34 -07:00
Krrish Dholakia | b7321ae4ee | fix(router.py): fix pre call check logic | 2024-03-23 18:56:08 -07:00
Krrish Dholakia | eb3ca85d7e | feat(router.py): enable pre-call checks; filter models outside of context window limits of a given message for a model group (https://github.com/BerriAI/litellm/issues/872) | 2024-03-23 18:03:30 -07:00
Krrish Dholakia | 2fabff06c0 | fix(bedrock.py): fix supported openai params for bedrock claude 3 | 2024-03-23 16:02:15 -07:00
Krrish Dholakia | 2a9fd4c28d | test(test_completion.py): make default claude 3 test message multi-turn | 2024-03-23 14:34:42 -07:00
Krrish Dholakia | f0bee037ad | build(test_python_38.py): add testing for litellm cli import | 2024-03-23 10:42:19 -07:00
Krrish Dholakia | 9b951b906d | test(test_completion.py): fix claude multi-turn conversation test | 2024-03-23 00:56:41 -07:00
Krrish Dholakia | 42a7588b04 | fix(anthropic.py): support async claude 3 tool calling + streaming (https://github.com/BerriAI/litellm/issues/2644) | 2024-03-22 19:57:01 -07:00
Krrish Dholakia | 0521e8a1d9 | fix(prompt_injection_detection.py): fix type check | 2024-03-21 08:56:13 -07:00
Vincelwt | 29e8c144fb | Merge branch 'main' into main | 2024-03-22 00:52:42 +09:00
Krrish Dholakia | 42d62cf99b | test(test_llm_guard.py): fix llm guard integration | 2024-03-21 08:31:11 -07:00