Ishaan Jaff | dcf50ff219 | feat add cron job for sending stats from prometheus | 2024-08-10 16:56:03 -07:00
Krrish Dholakia | c919c15c4a | fix(proxy_server.py): ensure internal_user params only apply to internal_user role | 2024-08-08 17:50:29 -07:00
Krrish Dholakia | 488a78e5f4 | fix(proxy_server.py): respect internal_user_budget_duration for sso user | 2024-08-08 17:28:28 -07:00
Krrish Dholakia | 384f8f4f54 | fix(anthropic.py): fix translation from /v1/messages format to openai format | 2024-08-08 17:19:12 -07:00
Krrish Dholakia | eb292e0b31 | feat: set max_internal_budget for user w/ sso | 2024-08-08 12:44:07 -07:00
Ishaan Jaff | 25e6733da3 | support provider wildcard routing | 2024-08-07 14:20:22 -07:00
Ishaan Jaff | 7bcb088ac0 | clean up unused func | 2024-08-06 20:58:51 -07:00
Ishaan Jaff | 22ca6d6d30 | fix forward ui requests when base url set | 2024-08-06 18:10:34 -07:00
Ishaan Jaff | e207ac577e | add redirect_ui_middleware | 2024-08-06 18:01:23 -07:00
Ishaan Jaff | 53451834bd | Merge pull request #5077 from BerriAI/litellm_fix_ui_emails ([Fix] Fix testing emails through Admin UI) | 2024-08-06 16:41:01 -07:00
Ishaan Jaff | 7fbd0855b6 | update alerting settings on ui | 2024-08-06 16:05:33 -07:00
Ishaan Jaff | 6a46899521 | add debug statements on docker | 2024-08-06 15:29:49 -07:00
Krrish Dholakia | 5ffadd631d | feat(proxy_server.py): allow restricting allowed email domains for the UI | 2024-08-06 12:53:03 -07:00
Ishaan Jaff | 8ad43780cf | add debugging utils to print when connecting to prisma | 2024-08-05 19:19:28 -07:00
Ishaan Jaff | 357ca893f8 | build ui on custom path | 2024-08-05 16:34:37 -07:00
Ishaan Jaff | e6f4d7a81d | set PROXY_BASE_URL when server root path set | 2024-08-05 15:48:44 -07:00
Krish Dholakia | 88a2428ee6 | Merge branch 'main' into litellm_anthropic_api_streaming | 2024-08-03 21:16:50 -07:00
Krrish Dholakia | 5810708c71 | feat(anthropic_adapter.py): support streaming requests for /v1/messages endpoint (Fixes https://github.com/BerriAI/litellm/issues/5011) | 2024-08-03 20:16:19 -07:00
Ishaan Jaff | c206a9c2d7 | set native vertex endpoints | 2024-08-03 16:52:03 -07:00
Krrish Dholakia | e6bc7e938a | fix(utils.py): handle scenario where model="azure/*" and custom_llm_provider="azure" (Fixes https://github.com/BerriAI/litellm/issues/4912) | 2024-08-02 17:48:53 -07:00
Krrish Dholakia | e60bec92ac | feat(ui): add ability to enable traceloop + langsmith via ui | 2024-07-31 21:40:29 -07:00
Krrish Dholakia | c5e38b3bec | feat(ui): add braintrust logging to ui | 2024-07-31 21:15:19 -07:00
Ishaan Jaff | 4f21ce2873 | Merge pull request #4987 from BerriAI/litellm_add_ft_endpoints ([Feat-Proxy] Add List fine-tuning jobs) | 2024-07-31 16:49:59 -07:00
Ishaan Jaff | f90f3f28fb | fix fine tuning endpoint postion on swagger | 2024-07-31 16:46:58 -07:00
Ishaan Jaff | 8a266a0493 | fix routes order | 2024-07-31 16:36:50 -07:00
Ishaan Jaff | d7c0e0c471 | allow setting files config | 2024-07-31 15:28:41 -07:00
Krrish Dholakia | 368af6975a | fix(http_handler.py): correctly re-raise timeout exception | 2024-07-31 14:51:28 -07:00
Ishaan Jaff | 967d18d03d | read ft config | 2024-07-31 13:31:00 -07:00
Ishaan Jaff | 2a87aa0463 | fix endpoint to create fine tuning jobs | 2024-07-31 12:44:01 -07:00
Ishaan Jaff | 0a3dbe0659 | fix linting errors | 2024-07-30 12:51:39 -07:00
Ishaan Jaff | b64e0fad81 | test batches endpoint on proxy | 2024-07-30 09:46:30 -07:00
Ishaan Jaff | 93f3143dc5 | Merge pull request #4916 from BerriAI/litellm_fix_ui_login (Feat UI - allow using custom header for litellm api key) | 2024-07-29 17:08:53 -07:00
Ishaan Jaff | 34ad3dec68 | check litellm header in login on ui | 2024-07-29 17:03:04 -07:00
Ishaan Jaff | 096844c258 | Merge pull request #4927 from BerriAI/litellm_set_max_request_response_size_ui (Feat Enterprise - set max request / response size UI) | 2024-07-27 20:06:09 -07:00
Ishaan Jaff | 64bc224d63 | Merge pull request #4928 from BerriAI/litellm_check_response_size ([Feat Enterprise] - check max response size) | 2024-07-27 17:03:56 -07:00
Ishaan Jaff | b5451eaf21 | allow setting max request / response size on admin UI | 2024-07-27 17:00:39 -07:00
Ishaan Jaff | 805d04f7f3 | feat check check_response_size_is_safe | 2024-07-27 16:53:39 -07:00
Ishaan Jaff | aade38760d | use common helpers for writing to otel | 2024-07-27 11:40:39 -07:00
Ishaan Jaff | cde46a4a09 | feat - use log_to_opentelemetry for _PROXY_track_cost_callback | 2024-07-27 11:08:22 -07:00
Ishaan Jaff | e3a66f2c62 | feat - clearly show version litellm enterprise | 2024-07-27 09:50:03 -07:00
Krish Dholakia | fb80839e8c | Merge pull request #4907 from BerriAI/litellm_proxy_get_secret (fix(proxy_server.py): fix get secret for environment_variables) | 2024-07-26 22:17:11 -07:00
Ishaan Jaff | 56cf8e2798 | feat link to model cost map on swagger | 2024-07-26 21:34:42 -07:00
Ishaan Jaff | 864f803ccf | fix for GET /v1/batches{batch_id:path} | 2024-07-26 18:23:15 -07:00
Ishaan Jaff | 2b889b83b3 | fix /v1/batches POST | 2024-07-26 18:06:00 -07:00
Krrish Dholakia | 1a172b7636 | fix(proxy_server.py): fix get secret for environment_variables | 2024-07-26 13:33:02 -07:00
Krrish Dholakia | 9d87767639 | feat(proxy_server.py): handle pydantic mockselvar error (Fixes https://github.com/BerriAI/litellm/issues/4898#issuecomment-2252105485) | 2024-07-26 08:38:51 -07:00
Ishaan Jaff | 1103c614a0 | Merge branch 'main' into litellm_proxy_support_all_providers | 2024-07-25 20:15:37 -07:00
Ishaan Jaff | d589d8e4ac | fix using pass_through_all_models | 2024-07-25 19:32:49 -07:00
Ishaan Jaff | a46c463dee | router support setting pass_through_all_models | 2024-07-25 18:34:12 -07:00
Krrish Dholakia | 84ef8c11ff | feat(proxy_server.py): support custom llm handler on proxy | 2024-07-25 17:56:34 -07:00