Commit graph

4502 commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Krrish Dholakia | f6e52ac771 | test: handle api errors for gemini/palm testing | 2024-02-21 21:44:08 -08:00 |
| Krrish Dholakia | c9c6547ef9 | test(test_streaming.py): handle gemini 500 error | 2024-02-21 21:32:03 -08:00 |
| Krrish Dholakia | 6ba1a5f6b2 | fix(utils.py): add exception mapping for gemini | 2024-02-21 21:31:26 -08:00 |
| Krrish Dholakia | fb2ae3a032 | fix(utils.py): only return cached streaming object for streaming calls | 2024-02-21 21:27:40 -08:00 |
| Krrish Dholakia | f1742769a2 | fix(utils.py): add palm exception mapping for 500 internal server error | 2024-02-21 21:18:03 -08:00 |
| Krrish Dholakia | 2d62dee712 | fix(utils.py): enable streaming cache logging | 2024-02-21 21:10:58 -08:00 |
| Krrish Dholakia | b011c8b93a | test(test_completion.py): handle palm failing | 2024-02-21 20:44:44 -08:00 |
| Krrish Dholakia | d2d9e63176 | test(test_custom_callback_input.py): fix test | 2024-02-21 20:32:39 -08:00 |
| Krrish Dholakia | f8b233b653 | fix(utils.py): support streaming cached response logging | 2024-02-21 17:53:14 -08:00 |
| Krish Dholakia | 0733bf1e7a | Merge pull request #2119 from BerriAI/litellm_updated_team_endpoints (Enable `/team/update`, `/team/delete` endpoints + create teams with user defined roles) | 2024-02-21 17:24:58 -08:00 |
| Krrish Dholakia | a7229c9253 | fix(proxy_server.py): enable proxy /team/delete endpoint | 2024-02-21 16:53:12 -08:00 |
| Krrish Dholakia | 55a02c1a31 | fix(proxy_server.py): enable /team/update endpoint for adding / deleting users from team | 2024-02-21 14:47:52 -08:00 |
| Ishaan Jaff | 6561af5656 | Merge pull request #2120 from BerriAI/litellm_admin_ui_fix ([FIX] Admin UI fixes - when using UI_USERNAME, UI_PASSWORD) | 2024-02-21 14:07:59 -08:00 |
| ishaan-jaff | 62b415a7d5 | (fix) use UI USERNAME | 2024-02-21 13:35:06 -08:00 |
| ishaan-jaff | 6a7250a84c | (fix) admin ui with UI_Username | 2024-02-21 13:29:55 -08:00 |
| Krrish Dholakia | 846757e343 | fix: show all teams user is a part of in user_info | 2024-02-21 13:29:42 -08:00 |
| ishaan-jaff | dc06205964 | (fix) ui allow users to make ui chat requests | 2024-02-21 13:25:52 -08:00 |
| Krish Dholakia | 851473b71a | Merge pull request #1969 from kan-bayashi/fix/support-multiple-tools-in-gemini (fix: fix the issues when using tools in gemini) | 2024-02-21 11:46:26 -08:00 |
| ishaan-jaff | de9336e1fc | (fix) fix admin ui not working | 2024-02-21 08:46:16 -08:00 |
| Krrish Dholakia | 5a0f962beb | fix(router.py): fix debug log | 2024-02-21 08:45:42 -08:00 |
| ishaan-jaff | 5734ab6b56 | (feat) new ui build | 2024-02-20 21:13:45 -08:00 |
| Krish Dholakia | ccdf85ed48 | Merge pull request #2106 from BerriAI/litellm_llm_guard_content_mod (fix(llm_guard.py): add streaming hook for moderation calls) | 2024-02-20 21:11:10 -08:00 |
| Krrish Dholakia | 54f7319be9 | refactor(main.py): trigger new build | 2024-02-20 20:36:33 -08:00 |
| Krrish Dholakia | 49847347d0 | fix(llm_guard.py): add streaming hook for moderation calls | 2024-02-20 20:31:32 -08:00 |
| Krish Dholakia | 6b794a8b6a | Merge pull request #2100 from BerriAI/litellm_presidio_ad_hoc_recognizers (fix(presidio_pii_masking.py): enable user to pass their own ad hoc recognizers to presidio) | 2024-02-20 19:45:29 -08:00 |
| Krish Dholakia | 162291a530 | Merge pull request #2103 from BerriAI/litellm_fix_async_sagemaker (fix(sagemaker.py): fix async sagemaker calls) | 2024-02-20 19:44:27 -08:00 |
| Krish Dholakia | 24ad5c4f7f | Merge pull request #2090 from BerriAI/litellm_gemini_streaming_fixes (fix(gemini.py): fix async streaming + add native async completions) | 2024-02-20 19:07:58 -08:00 |
| ishaan-jaff | 476f401b74 | (feat) update non default values | 2024-02-20 18:55:20 -08:00 |
| Krrish Dholakia | 49c4aa5e75 | fix(sagemaker.py): fix async sagemaker calls (https://github.com/BerriAI/litellm/issues/2086) | 2024-02-20 17:20:01 -08:00 |
| Krrish Dholakia | 1d3bef2e9c | fix(gemini.py): implement custom streamer | 2024-02-20 17:10:51 -08:00 |
| Krrish Dholakia | d706d3b672 | fix(presidio_pii_masking.py): enable user to pass ad hoc recognizer for pii masking | 2024-02-20 16:01:15 -08:00 |
| ishaan-jaff | 0e8f639e5d | (feat) proxy - view all | 2024-02-20 15:37:39 -08:00 |
| Krrish Dholakia | 72bcd5a4af | fix(presidio_pii_masking.py): enable user to pass their own ad hoc recognizers to presidio | 2024-02-20 15:19:31 -08:00 |
| ishaan-jaff | 6546b43e5c | (fix) ui build issues | 2024-02-20 13:24:30 -08:00 |
| ishaan-jaff | 4e3f85907d | (fix) ui build issues | 2024-02-20 13:24:21 -08:00 |
| ishaan-jaff | 1dfb302dc7 | (feat) new ui build | 2024-02-20 13:18:18 -08:00 |
| Krrish Dholakia | 7b641491a2 | fix(utils.py): fix print statement | 2024-02-19 23:00:41 -08:00 |
| Krrish Dholakia | b3886b4bda | refactor(main.py): trigger new build | 2024-02-19 22:53:38 -08:00 |
| Krish Dholakia | e2da1e8ac5 | Merge pull request #2087 from BerriAI/litellm_llm_guard_integration (feat(llm_guard.py): support llm guard for content moderation) | 2024-02-19 22:48:12 -08:00 |
| Krrish Dholakia | 45eb4a5fcc | fix(gemini.py): fix async streaming + add native async completions | 2024-02-19 22:41:36 -08:00 |
| ishaan-jaff | 9c4b570f6e | (feat) proxy allow user/request_model | 2024-02-19 20:56:28 -08:00 |
| Krrish Dholakia | 14513af2e2 | feat(llm_guard.py): support llm guard for content moderation (https://github.com/BerriAI/litellm/issues/2056) | 2024-02-19 20:51:25 -08:00 |
| ishaan-jaff | c5ac2f13f3 | (fix) user_id_models on sso/callback | 2024-02-19 20:39:17 -08:00 |
| ishaan-jaff | be73bad9f6 | (feat) /user/get_requests | 2024-02-19 20:39:17 -08:00 |
| ishaan-jaff | cdbfac4c07 | (feat) use user_notification table | 2024-02-19 20:39:17 -08:00 |
| ishaan-jaff | b989f19e6e | (feat) proxy - save model access requests | 2024-02-19 20:39:17 -08:00 |
| Krrish Dholakia | 67e93a1865 | test(test_presidio_pii_masking.py): add more unit tests | 2024-02-19 20:39:17 -08:00 |
| Krrish Dholakia | 9fa4dfbdd3 | test(test_presidio_pii_masking.py): add more unit tests | 2024-02-19 16:30:44 -08:00 |
| Ishaan Jaff | f8a204c101 | Merge pull request #2066 from BerriAI/litellm_show_if_user_has_model_access ([FEAT] /model/info show models user has access to) | 2024-02-19 15:00:17 -08:00 |
| ishaan-jaff | 050114bb29 | (fix) use "/v2/model/info", | 2024-02-19 13:29:19 -08:00 |