Litellm dev 11 02 2024 (#6561)

* fix(dual_cache.py): update in-memory check for redis batch get cache

Fixes the latency delay in async_batch_redis_cache
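
A minimal sketch of the pattern this fix targets (not litellm's actual DualCache code; the class and attribute names are illustrative): serve batch lookups from the in-memory cache first and only query Redis for the keys that are still missing.

```python
from typing import Any, Dict, List, Optional


class DualCacheSketch:
    """Illustrative only: a plain dict stands in for the in-memory cache,
    and redis_client is assumed to expose async_batch_get_cache(keys)."""

    def __init__(self, redis_client: Optional[Any] = None) -> None:
        self.in_memory: Dict[str, Any] = {}
        self.redis_client = redis_client

    async def async_batch_get_cache(self, keys: List[str]) -> List[Optional[Any]]:
        # 1. answer whatever the in-memory cache already holds
        results: List[Optional[Any]] = [self.in_memory.get(k) for k in keys]

        # 2. only the still-missing keys go to Redis, so the round trip is
        #    skipped entirely when everything is already in memory
        missing = [k for k, v in zip(keys, results) if v is None]
        if missing and self.redis_client is not None:
            redis_values: Dict[str, Any] = await self.redis_client.async_batch_get_cache(missing)
            for idx, key in enumerate(keys):
                if results[idx] is None and redis_values.get(key) is not None:
                    # 3. backfill in-memory so the next lookup never touches Redis
                    self.in_memory[key] = redis_values[key]
                    results[idx] = redis_values[key]
        return results
```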

* fix(service_logger.py): fix race condition causing otel service logging to be overwritten when service_callbacks is set
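
A hedged guess at the shape of this fix (the real change in service_logger.py may differ): merge the user-configured service_callbacks with the already-registered otel logger instead of replacing the list, so the otel entry is not silently dropped. The helper and callback names below are hypothetical.

```python
from typing import List


def merge_service_callbacks(registered: List[str], configured: List[str]) -> List[str]:
    # hypothetical helper: keep previously registered callbacks (e.g. "otel")
    # and append the user-configured ones without duplicates
    merged = list(registered)
    for callback in configured:
        if callback not in merged:
            merged.append(callback)
    return merged


# e.g. merge_service_callbacks(["otel"], ["custom_logger"]) -> ["otel", "custom_logger"]
```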

* feat(user_api_key_auth.py): add parent otel component for auth

allows us to isolate how much latency is added by auth checks
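
A rough sketch of the idea using the standard opentelemetry-api (litellm's own OTEL integration wraps this differently; the span name and helper below are illustrative): open one parent span around the whole auth path so every downstream check nests under it and the span's duration shows the total latency auth adds.

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)


async def run_auth_checks(api_key: str) -> dict:
    # hypothetical stand-in for the real key / team / budget checks
    return {"api_key": api_key, "valid": True}


async def user_api_key_auth(api_key: str) -> dict:
    # everything awaited inside this block is recorded as a child of the
    # parent auth span, so auth latency is isolated per request
    with tracer.start_as_current_span("user_api_key_auth"):
        return await run_auth_checks(api_key)
```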

* perf(parallel_request_limiter.py): move async_set_cache_pipeline (in the max parallel request limiter) out of the execution path by running it as a background task

reduces latency by 200ms
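
A minimal sketch of the move, assuming asyncio (the surrounding handler is hypothetical; only the function name mirrors the commit): instead of awaiting the pipelined cache write inside the request path, schedule it as a fire-and-forget background task.

```python
import asyncio
from typing import Any, List, Tuple


async def async_set_cache_pipeline(cache_list: List[Tuple[str, Any]]) -> None:
    # stand-in for the pipelined Redis write that previously blocked the request
    await asyncio.sleep(0.2)


async def handle_request(cache_updates: List[Tuple[str, Any]]) -> str:
    # before: `await async_set_cache_pipeline(cache_updates)` sat in the hot path
    # after: schedule the write as a background task; the response no longer waits
    # (in real code, keep a reference to the task so it isn't garbage collected)
    asyncio.create_task(async_set_cache_pipeline(cache_updates))
    return "response"
```

The trade-off is eventual consistency: a request can be admitted before the previous usage write lands in Redis.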

* feat(user_api_key_auth.py): have user api key auth object return user tpm/rpm limits - reduces redis calls in downstream task (parallel_request_limiter)

Reduces latency by 400-800ms
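
A sketch of the data-flow change, with illustrative field names (litellm's actual auth object carries many more fields): auth already reads the user record, so it can carry the tpm/rpm limits forward on its result object, and the parallel request limiter reads them from there instead of issuing its own Redis lookups.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AuthResultSketch:
    api_key: str
    # populated during auth from the user record it already fetched
    user_tpm_limit: Optional[int] = None
    user_rpm_limit: Optional[int] = None


def under_user_limits(auth: AuthResultSketch, current_tpm: int, current_rpm: int) -> bool:
    # the limiter reads limits off the auth object: no extra Redis round trip here
    if auth.user_rpm_limit is not None and current_rpm >= auth.user_rpm_limit:
        return False
    if auth.user_tpm_limit is not None and current_tpm >= auth.user_tpm_limit:
        return False
    return True
```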

* fix(parallel_request_limiter.py): use batch get cache to reduce user/key/team usage object calls

reduces latency by 50-100ms
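
Roughly what this looks like (the cache key layout below is made up; only the batch-call shape matters): fetch the user, key, and team usage objects in a single batch get instead of three sequential cache reads.

```python
from typing import Any, Dict, List, Optional


async def get_usage_objects(
    cache: Any, user_id: str, api_key_hash: str, team_id: str
) -> Dict[str, Optional[Any]]:
    # hypothetical key layout; real keys are whatever the limiter already uses
    keys: List[str] = [
        f"usage::{user_id}",
        f"usage::{api_key_hash}",
        f"usage::{team_id}",
    ]
    # one round trip to the dual cache instead of three sequential gets
    values = await cache.async_batch_get_cache(keys=keys)
    return dict(zip(keys, values))
```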

* fix: fix linting error

* fix(_service_logger.py): fix import

* fix(user_api_key_auth.py): fix service logging

* fix(dual_cache.py): don't pass 'self'

* fix: fix python3.8 error

* fix: fix init
Krish Dholakia 2024-11-04 07:48:20 +05:30 committed by GitHub
parent e5b4a71c79
commit cc19a9f6a1
17 changed files with 303 additions and 157 deletions


@@ -59,12 +59,15 @@ async def test_dual_cache_async_batch_get_cache():
    redis_cache = RedisCache()  # get credentials from environment
    dual_cache = DualCache(in_memory_cache=in_memory_cache, redis_cache=redis_cache)
    in_memory_cache.set_cache(key="test_value", value="hello world")
    with patch.object(
        dual_cache.redis_cache, "async_batch_get_cache", new=AsyncMock()
    ) as mock_redis_cache:
        mock_redis_cache.return_value = {"test_value_2": None, "test_value": "hello"}
        result = await dual_cache.async_batch_get_cache(keys=["test_value", "test_value_2"])
        await dual_cache.async_batch_get_cache(keys=["test_value", "test_value_2"])
        await dual_cache.async_batch_get_cache(keys=["test_value", "test_value_2"])
        assert result[0] == "hello world"
        assert result[1] == None
        assert mock_redis_cache.call_count == 1
def test_dual_cache_batch_get_cache():