Krrish Dholakia | 7bc76ddbc3 | feat(llm_guard.py): enable key-specific llm guard check | 2024-03-26 17:21:51 -07:00
Krrish Dholakia | 313f58c483 | build(schema.prisma): update schema to enable team blocking | 2024-03-26 17:03:32 -07:00
Krish Dholakia | 4d53b484cb | Merge pull request #2675 from onukura/ollama-embedding: Fix Ollama embedding | 2024-03-26 16:08:28 -07:00
Krish Dholakia | a8cdb82ef0 | Merge pull request #2697 from antoniomdk/fix-database-credentials-leakage: (fix) Remove print statements from append_query_params | 2024-03-26 16:07:33 -07:00
Krish Dholakia | d51f12ca44 | Merge pull request #2704 from BerriAI/litellm_jwt_auth_improvements_3: fix(handle_jwt.py): enable team-based jwt-auth access | 2024-03-26 16:06:56 -07:00
Krrish Dholakia | 996fa82a2f | test(test_jwt.py): fix test | 2024-03-26 15:22:05 -07:00
Krrish Dholakia | 5ab34345e9 | fix(proxy_server.py): rename proxy roles param to litellm_jwtauth | 2024-03-26 15:04:30 -07:00
Krrish Dholakia | 4028f935a5 | fix(utils.py): check if item in list is pydantic object or dict before dereferencing | 2024-03-26 14:39:16 -07:00
Krrish Dholakia | a0f55b92e6 | fix(handle_jwt.py): support public key caching ttl param | 2024-03-26 14:32:55 -07:00
Krrish Dholakia | d69ae350b4 | fix(proxy_server.py): check if team scope in jwt | 2024-03-26 14:01:02 -07:00
Krrish Dholakia | c21e954c6f | test(test_batch_completions.py): handle anthropic overloaded error | 2024-03-26 13:55:03 -07:00
Krrish Dholakia | a045de8c7c | test(test_batch_completions.py): handle overloaded anthropic error | 2024-03-26 13:53:18 -07:00
Krrish Dholakia | 05fddcb06b | fix(utils.py): don't send subsequent chunks if last chunk sent (prevents multiple empty finish chunks from being sent) | 2024-03-26 13:49:42 -07:00
Krrish Dholakia | b9180a8c72 | fix(handle_jwt.py): enable team-based jwt-auth access (move auth to check on 'client_id', not 'sub') | 2024-03-26 12:25:38 -07:00
Ishaan Jaff | a1e8f9fd46 | Merge pull request #2703 from BerriAI/litellm_remove_litellm_telemetry: [Fix] remove litellm telemetry from proxy server | 2024-03-26 11:35:18 -07:00
Ishaan Jaff | f5f7e344c7 | Merge branch 'main' into litellm_remove_litellm_telemetry | 2024-03-26 11:35:02 -07:00
Ishaan Jaff | 15709e2805 | (fix) telemetry = false | 2024-03-26 11:23:23 -07:00
Ishaan Jaff | 62c83d36a5 | (fix) remove litellm.telemetry | 2024-03-26 11:21:09 -07:00
Ishaan Jaff | d321f6f638 | (docs) switch off litellm telemetry | 2024-03-26 11:19:55 -07:00
Krish Dholakia | 3f33365e04 | Update README.md | 2024-03-26 10:34:16 -07:00
Krrish Dholakia | 713738c9e6 | test(test_router_debug_logs.py): add info statement to log test | 2024-03-26 09:54:26 -07:00
Krrish Dholakia | 44bc79aadb | fix(utils.py): check if message is pydantic object or dict before dereferencing | 2024-03-26 09:47:44 -07:00
Ishaan Jaff | 04eca9de84 | Merge pull request #2702 from BerriAI/litellm_cache_flush: [Feat] Proxy - /cache/flushall - delete all elements from cache | 2024-03-26 09:34:39 -07:00
Ishaan Jaff | 1a2ec398a8 | (fix) doc string | 2024-03-26 09:25:44 -07:00
Ishaan Jaff | 73f38f1b9a | (fix) undo change from other branch | 2024-03-26 09:24:12 -07:00
Ishaan Jaff | ee54bbcd89 | (fix) undo changes from other branches | 2024-03-26 09:22:19 -07:00
Ishaan Jaff | f1ebbd32b8 | (feat) /cache/flushall | 2024-03-26 09:18:58 -07:00
Ishaan Jaff | 237440cf13 | (feat) support cache flush on redis | 2024-03-26 09:12:30 -07:00
Ishaan Jaff | 336fe2f876 | (fix) in mem redis reads | 2024-03-26 09:10:49 -07:00
Krrish Dholakia | 1137264d99 | test(test_streaming.py): add unit testing for custom stream wrapper | 2024-03-26 08:57:44 -07:00
Antonio Molner Domenech | 22629898c9 | Update print statements to use verbose logger and DEBUG level | 2024-03-26 22:41:28 +07:00
Krish Dholakia | 3fef983439 | Merge pull request #2656 from TashaSkyUp/patch-1: fix for: when using ModelResponse.json() to save and then reconstruct a ModelResponse the choices field ends up empty | 2024-03-26 08:36:55 -07:00
Krrish Dholakia | 00d27a324d | fix(router.py): check for context window error when handling 400 status code errors (was causing proxy context window fallbacks to not work as expected) | 2024-03-26 08:08:15 -07:00
Ishaan Jaff | 9668824d77 | (test) claude-1 api is unstable | 2024-03-26 08:07:16 -07:00
Ishaan Jaff | ebdec4d262 | (fix) cache control logic | 2024-03-26 07:36:45 -07:00
Ishaan Jaff | bf5b55df69 | (fix) prod.md | 2024-03-25 22:30:22 -07:00
Ishaan Jaff | f7f0c56f4c | (kub) always pull litellm image | 2024-03-25 22:29:23 -07:00
Ishaan Jaff | 7bf9cb3c54 | (fix) cache control logic | 2024-03-25 22:19:34 -07:00
Krrish Dholakia | aafbaad249 | bump: version 1.34.3 → 1.34.4 | 2024-03-25 21:58:49 -07:00
Krish Dholakia | d620d94134 | Merge pull request #2692 from BerriAI/litellm_streaming_fixes: fix(utils.py): ensure last chunk is always empty delta w/ finish reason | 2024-03-25 21:57:04 -07:00
Krrish Dholakia | eda65a25e3 | test(test_caching.py): fix test_redis_cache_acompletion_stream | 2024-03-25 21:36:47 -07:00
Krrish Dholakia | be055f31ee | test(test_azure_astreaming_and_function_calling): fix test to handle caching | 2024-03-25 19:33:57 -07:00
Krrish Dholakia | f604a6155f | fix(utils.py): persist system fingerprint across chunks | 2024-03-25 19:24:09 -07:00
Krrish Dholakia | e9096ee922 | fix(test_amazing_vertex_completion.py): fix test to check if content is none | 2024-03-25 19:11:39 -07:00
Krish Dholakia | a4eaaa8bdb | Update README.md | 2024-03-25 19:03:48 -07:00
Krrish Dholakia | c5bd4d4233 | fix(utils.py): log success event for streaming | 2024-03-25 19:03:10 -07:00
Ishaan Jaff | e8c5dabdb9 | bump: version 1.34.2 → 1.34.3 | 2024-03-25 18:58:40 -07:00
Ishaan Jaff | 272b9be9cf | Merge pull request #2695 from BerriAI/litellm_fix_cache_controls: (test) Using Cache controls with litellm - `no-cache=True` | 2024-03-25 18:58:05 -07:00
Ishaan Jaff | dd6dedd0aa | (test) no cache hit | 2024-03-25 18:56:36 -07:00
Ishaan Jaff | 0c55e83c9d | Merge pull request #2694 from BerriAI/litellm_better_cache_debugging: [Fix] better cache debugging/error statements | 2024-03-25 18:47:54 -07:00