Commit graph

7637 commits

Author | SHA1 | Message | Date
ishaan-jaff
62b415a7d5 (fix) use UI USERNAME 2024-02-21 13:35:06 -08:00
ishaan-jaff
6a7250a84c (fix) admin ui with UI_Username 2024-02-21 13:29:55 -08:00
ishaan-jaff
dc06205964 (fix) ui allow users to make ui chat requests 2024-02-21 13:25:52 -08:00
ishaan-jaff
9f753ab194 bump: version 1.26.6 → 1.26.7 2024-02-21 08:46:23 -08:00
ishaan-jaff
de9336e1fc (fix) fix admin ui not working 2024-02-21 08:46:16 -08:00
Krrish Dholakia
5a0f962beb fix(router.py): fix debug log 2024-02-21 08:45:42 -08:00
ishaan-jaff
7f179e13ee bump: version 1.26.5 → 1.26.6 2024-02-20 21:13:56 -08:00
ishaan-jaff
5734ab6b56 (feat) new ui build 2024-02-20 21:13:45 -08:00
ishaan-jaff
7419547ba1 (docs) model max budgets 2024-02-20 21:11:17 -08:00
Krish Dholakia
ccdf85ed48 Merge pull request #2106 from BerriAI/litellm_llm_guard_content_mod: fix(llm_guard.py): add streaming hook for moderation calls 2024-02-20 21:11:10 -08:00
Ishaan Jaff
2f7b2f07dd Merge pull request #2107 from BerriAI/litellm_docs_router_use_correct_model_cost: (docs) Router - use correct base model for cost 2024-02-20 21:01:30 -08:00
ishaan-jaff
2b2e62477b (docs) use correct base model for cost 2024-02-20 21:00:43 -08:00
Krrish Dholakia
9657a88499 bump: version 1.26.4 → 1.26.5 2024-02-20 20:37:05 -08:00
Krrish Dholakia
54f7319be9 refactor(main.py): trigger new build 2024-02-20 20:36:33 -08:00
Krrish Dholakia
49847347d0 fix(llm_guard.py): add streaming hook for moderation calls 2024-02-20 20:31:32 -08:00
Krish Dholakia
6b794a8b6a Merge pull request #2100 from BerriAI/litellm_presidio_ad_hoc_recognizers: fix(presidio_pii_masking.py): enable user to pass their own ad hoc recognizers to presidio 2024-02-20 19:45:29 -08:00
Krish Dholakia
162291a530 Merge pull request #2103 from BerriAI/litellm_fix_async_sagemaker: fix(sagemaker.py): fix async sagemaker calls 2024-02-20 19:44:27 -08:00
ishaan-jaff
0a5b8f0e4e bump: version 1.26.3 → 1.26.4 2024-02-20 19:08:13 -08:00
Krish Dholakia
24ad5c4f7f Merge pull request #2090 from BerriAI/litellm_gemini_streaming_fixes: fix(gemini.py): fix async streaming + add native async completions 2024-02-20 19:07:58 -08:00
Ishaan Jaff
cb2ef26b34 Merge pull request #2104 from BerriAI/litellm_fix_updating_keys: [Fix] Unexpected Model Deletion in POST /key/update When Updating Team ID 2024-02-20 19:06:49 -08:00
ishaan-jaff
476f401b74 (feat) update non default values 2024-02-20 18:55:20 -08:00
Krrish Dholakia
49c4aa5e75 fix(sagemaker.py): fix async sagemaker calls (https://github.com/BerriAI/litellm/issues/2086) 2024-02-20 17:20:01 -08:00
Ishaan Jaff
a4365926a1 Merge pull request #2102 from BerriAI/litellm_create_new_users: [FEAT] UI - Create Users on LiteLLM UI 2024-02-20 17:11:30 -08:00
Krrish Dholakia
1d3bef2e9c fix(gemini.py): implement custom streamer 2024-02-20 17:10:51 -08:00
ishaan-jaff
4238e4c82a (feat) ui - create new users 2024-02-20 17:10:33 -08:00
ishaan-jaff
02f7b902db (ui) create user 2024-02-20 16:41:51 -08:00
Krrish Dholakia
987cb38976 docs(pii_masking.md): docs for ad-hoc recognizers for pii masking 2024-02-20 16:08:34 -08:00
Krrish Dholakia
d706d3b672 fix(presidio_pii_masking.py): enable user to pass ad hoc recognizer for pii masking 2024-02-20 16:01:15 -08:00
Ishaan Jaff
c656eaf7a4 Merge pull request #2101 from BerriAI/litellm_allow_admin_create_users: [FEAT] UI - View all users 2024-02-20 15:42:04 -08:00
ishaan-jaff
0e8f639e5d (feat) proxy - view all 2024-02-20 15:37:39 -08:00
ishaan-jaff
3a662539bd (ui) show user table on admin UI 2024-02-20 15:34:10 -08:00
Krrish Dholakia
72bcd5a4af fix(presidio_pii_masking.py): enable user to pass their own ad hoc recognizers to presidio 2024-02-20 15:19:31 -08:00
ishaan-jaff
6546b43e5c (fix) ui build issues 2024-02-20 13:24:30 -08:00
ishaan-jaff
4e3f85907d (fix) ui build issues 2024-02-20 13:24:21 -08:00
ishaan-jaff
1dfb302dc7 (feat) new ui build 2024-02-20 13:18:18 -08:00
Krrish Dholakia
1227964155 docs(enterprise.md): clarify what's in enterprise 2024-02-20 08:43:28 -08:00
Krrish Dholakia
7b641491a2 fix(utils.py): fix print statement 2024-02-19 23:00:41 -08:00
Krrish Dholakia
e486440db8 bump: version 1.26.2 → 1.26.3 2024-02-19 22:55:18 -08:00
Krrish Dholakia
b3886b4bda refactor(main.py): trigger new build 2024-02-19 22:53:38 -08:00
Krish Dholakia
e2da1e8ac5 Merge pull request #2087 from BerriAI/litellm_llm_guard_integration: feat(llm_guard.py): support llm guard for content moderation 2024-02-19 22:48:12 -08:00
Krrish Dholakia
45eb4a5fcc fix(gemini.py): fix async streaming + add native async completions 2024-02-19 22:41:36 -08:00
ishaan-jaff
a301537c41 bump: version 1.26.1 → 1.26.2 2024-02-19 21:36:44 -08:00
Ishaan Jaff
61046bb67b Merge pull request #2089 from BerriAI/litellm_admin_view_pending_requests: [FEAT] Admin UI - Approve / Deny user mode requests 2024-02-19 21:35:52 -08:00
ishaan-jaff
9683760241 (ui) show pendingRequests 2024-02-19 21:35:28 -08:00
ishaan-jaff
84652a82cb (feat) adin ui show all pending user requests 2024-02-19 21:23:59 -08:00
Krrish Dholakia
fde478f70b docs(enterprise.md): add llm guard to docs 2024-02-19 21:05:01 -08:00
Ishaan Jaff
45326c93dc Merge pull request #2077 from BerriAI/litellm_request_model_access: [Feat] UI - allow a user to request access to a model 2024-02-19 20:57:02 -08:00
ishaan-jaff
9c4b570f6e (feat) proxy allow user/request_model 2024-02-19 20:56:28 -08:00
ishaan-jaff
0bd8b77fa0 (feat) ui allow a user to request access to a model 2024-02-19 20:55:03 -08:00
Krrish Dholakia
14513af2e2 feat(llm_guard.py): support llm guard for content moderation (https://github.com/BerriAI/litellm/issues/2056) 2024-02-19 20:51:25 -08:00