Commit graph

2077 commits

Author SHA1 Message Date
Ishaan Jaff
ebe842d404 log startTime and endTime for exceptions 2024-04-30 13:34:14 -07:00
Ishaan Jaff
ab2b78cd88 feat log request kwargs in error logs 2024-04-30 13:28:26 -07:00
Ishaan Jaff
918fff424f fix log model_group 2024-04-30 13:11:09 -07:00
Ishaan Jaff
7f9b9d0173 fix - log api_base in errors 2024-04-30 13:02:42 -07:00
Ishaan Jaff
96cbdcb349 fix - working exception writing 2024-04-30 12:48:17 -07:00
Ishaan Jaff
cb44c0ac44 fix types for errorLog 2024-04-30 12:31:33 -07:00
Ishaan Jaff
75ca067cc7 add LiteLLM_ErrorLogs to types 2024-04-30 12:16:03 -07:00
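The seven commits above build out proxy error logging end to end: a LiteLLM_ErrorLogs type, plus startTime/endTime, request kwargs, model_group, and api_base on each record. As a rough illustration only, those fields could be collected into a record like the following (the class name and layout are hypothetical, not litellm's actual LiteLLM_ErrorLogs schema):

```python
# Hypothetical sketch assembling the fields the commits above reference;
# litellm's real LiteLLM_ErrorLogs type may differ.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, Dict


@dataclass
class ErrorLogRecord:
    startTime: datetime            # when the failing request began
    endTime: datetime              # when the exception surfaced
    api_base: str                  # upstream endpoint the call was routed to
    model_group: str               # router model group the request targeted
    exception_type: str            # e.g. "RateLimitError"
    request_kwargs: Dict[str, Any] = field(default_factory=dict)  # logged call args
```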
Krrish Dholakia
5b56a0856e docs(load_test.md): load test multiple instances of the proxy w/ tpm/rpm limits on deployments 2024-04-29 15:58:14 -07:00
Krrish Dholakia
a3fea74826 fix(proxy_server.py): return more detailed auth error message. 2024-04-29 07:24:19 -07:00
CyanideByte
5070b218ec Merge branch 'BerriAI:main' into main 2024-04-27 20:51:33 -07:00
CyanideByte
28090d2ae3 Added _types.py cases from edwinjosegeorge PR#3340 2024-04-27 20:42:54 -07:00
Ishaan Jaff
9850a81879 ui - new build 2024-04-27 17:28:30 -07:00
Krrish Dholakia
d1e58193d0 test: replace flaky endpoint 2024-04-27 16:37:09 -07:00
Ishaan Jaff
cb467182a3 fix - only run global_proxy_spend on chat completion calls 2024-04-27 14:11:00 -07:00
Krish Dholakia
26bacef87b Merge branch 'main' into litellm_default_router_retries 2024-04-27 11:21:57 -07:00
Krrish Dholakia
069d1f863d fix(router.py): add /v1/ if missing to base url, for openai-compatible APIs
Fixes https://github.com/BerriAI/litellm/issues/2279
2024-04-26 17:05:07 -07:00
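A minimal sketch of the normalization this fix describes; the helper name is made up, and litellm's actual router code likely handles more cases:

```python
# Illustrative only: ensure an OpenAI-compatible base URL ends in /v1,
# as the commit above describes for router.py.
def ensure_v1_suffix(api_base: str) -> str:
    base = api_base.rstrip("/")
    return base if base.endswith("/v1") else base + "/v1"


assert ensure_v1_suffix("http://localhost:8000") == "http://localhost:8000/v1"
assert ensure_v1_suffix("http://localhost:8000/v1/") == "http://localhost:8000/v1"
```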
Krish Dholakia
37f2045181 Merge branch 'main' into litellm_default_router_retries 2024-04-26 14:52:24 -07:00
Krrish Dholakia
ae04330348 fix(proxy_server.py): fix setting offset-aware datetime 2024-04-25 21:18:32 -07:00
Ishaan Jaff
db1bfe5cc5 ui - new build 2024-04-25 20:33:02 -07:00
Krish Dholakia
f9814105f7 Merge pull request #3310 from BerriAI/litellm_langfuse_error_logging_2
fix(proxy/utils.py): log rejected proxy requests to langfuse
2024-04-25 19:49:59 -07:00
Krrish Dholakia
462724feef fix(proxy/utils.py): log rejected proxy requests to langfuse 2024-04-25 19:26:27 -07:00
Krish Dholakia
33393f0029 Merge pull request #3308 from BerriAI/litellm_fix_streaming_n
fix(utils.py): fix the response object returned when n>1 for stream=true
2024-04-25 18:36:54 -07:00
Ishaan Jaff
7411d6d495 ui - new build 2024-04-25 16:39:05 -07:00
Ishaan Jaff
9f0284affa fixes for testing alerting 2024-04-25 16:33:55 -07:00
Ishaan Jaff
cf9bb928f5 fix - add alert_to_webhook_url to general settings 2024-04-25 15:43:54 -07:00
Ishaan Jaff
a6132ab5d4 backend - update slack alert_to_webhook_url_map 2024-04-25 13:47:52 -07:00
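For context on the two alerting commits above, the mapping presumably routes each alert type to its own Slack webhook under the proxy's general settings. A hedged sketch of that shape (the alert-type keys and URLs here are assumptions, not taken from the source):

```python
# Assumed shape of alert_to_webhook_url in general settings;
# keys and webhook URLs are placeholders.
general_settings = {
    "alert_to_webhook_url": {
        "llm_exceptions": "https://hooks.slack.com/services/T000/B000/XXX",
        "budget_alerts": "https://hooks.slack.com/services/T000/B001/YYY",
    }
}
```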
Krrish Dholakia
157dd819f6 fix(utils.py): fix the response object returned when n>1 for stream=true
Fixes https://github.com/BerriAI/litellm/issues/3276
2024-04-25 13:27:29 -07:00
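The fix above concerns streamed responses with n>1, where each chunk's choices carry an index identifying which of the n completions a delta belongs to. A sketch of consuming such a stream (the model name is a placeholder, and a configured API key is assumed):

```python
import litellm

# Group streamed deltas by choice index when n > 1.
response = litellm.completion(
    model="gpt-3.5-turbo",  # placeholder model
    messages=[{"role": "user", "content": "hi"}],
    n=2,
    stream=True,
)

completions: dict[int, str] = {}
for chunk in response:
    for choice in chunk.choices:
        delta = choice.delta.content or ""
        completions[choice.index] = completions.get(choice.index, "") + delta
```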
Ishaan Jaff
2190b039fd pass alert type on alerting handle 2024-04-25 13:05:34 -07:00
Krrish Dholakia
f1b2405fe0 fix(router.py): fix default retry logic 2024-04-25 11:57:27 -07:00
Krrish Dholakia
cf24e3eb02 fix(factory.py): add replicate meta llama prompt templating support 2024-04-25 08:25:00 -07:00
Krrish Dholakia
f260050e4d fix(proxy_server.py): fix update router 2024-04-24 23:01:21 -07:00
Krrish Dholakia
9577548060 fix(proxy_server.py): fix updating non-router settings for proxy config 2024-04-24 22:50:04 -07:00
Krish Dholakia
265f16456f Merge pull request #3267 from BerriAI/litellm_openai_streaming_fix
fix(utils.py): fix streaming to not return usage dict
2024-04-24 21:08:33 -07:00
Ishaan Jaff
42863765a1 ui - new build 2024-04-24 21:04:44 -07:00
Ishaan Jaff
98033ff75d ui - new build 2024-04-24 20:40:01 -07:00
Krrish Dholakia
5f862819e7 fix(utils.py): fix setattr error 2024-04-24 20:19:27 -07:00
Krrish Dholakia
1988ce3247 feat(router.py): support mock testing fallbacks flag
allow the user to test whether fallbacks work as expected by setting a `mock_testing_fallbacks = True` flag on a call
2024-04-24 20:13:10 -07:00
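Per the commit message, the flag is passed on a call to force the primary deployment to fail so the fallback path gets exercised. A sketch under that assumption (model names and the fallback chain are placeholders):

```python
from litellm import Router

router = Router(
    model_list=[
        {"model_name": "primary", "litellm_params": {"model": "gpt-3.5-turbo"}},
        {"model_name": "backup", "litellm_params": {"model": "gpt-4"}},
    ],
    fallbacks=[{"primary": ["backup"]}],
)

# mock_testing_fallbacks=True makes the primary call raise on purpose,
# so a successful response means the "backup" fallback was used.
resp = router.completion(
    model="primary",
    messages=[{"role": "user", "content": "ping"}],
    mock_testing_fallbacks=True,
)
```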
Krrish Dholakia
f4bd85a489 fix(proxy_server.py): fix /config/update/
allows updating the router config via the UI and propagates the change across all proxy instances by persisting config changes to the db
2024-04-24 16:42:42 -07:00
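A hedged sketch of what a call to the /config/update/ endpoint this commit fixes might look like; the payload field names, port, and master-key header are all assumptions:

```python
import requests

resp = requests.post(
    "http://localhost:4000/config/update",              # default proxy port assumed
    headers={"Authorization": "Bearer sk-master-key"},   # placeholder master key
    json={"router_settings": {"num_retries": 2}},        # assumed payload shape
)
print(resp.status_code)
```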
Ishaan Jaff
f7053ac85f Merge pull request #3277 from BerriAI/litellm_update_deployments
[UI] V0 - Edit Model tpm, rpm, api_base
2024-04-24 14:03:00 -07:00
Krrish Dholakia
cbcb3318f4 fix(langfuse.py): fix langfuse environ check 2024-04-24 13:21:00 -07:00
Krrish Dholakia
c3d9c7a282 bump: version 1.35.23 → 1.35.24 2024-04-24 11:13:37 -07:00
Krrish Dholakia
f93d859569 fix(ui): fix adding slack/langfuse logging on ui 2024-04-24 10:58:44 -07:00
Ishaan Jaff
c19f28b7d2 /model/update endpoint 2024-04-24 10:39:20 -07:00
Krrish Dholakia
840bb5a18d fix(proxy_server.py): add new flag for disable sharing master key on ui 2024-04-24 10:06:01 -07:00
Krrish Dholakia
cd9f7b44e5 bump: version 1.35.22 → 1.35.23 2024-04-24 08:16:33 -07:00
Krrish Dholakia
6b938ff14d build(ui): add ui updates to main build 2024-04-23 22:53:37 -07:00
Ishaan Jaff
6baef1a578 ui - new build 2024-04-23 22:23:49 -07:00
Ishaan Jaff
7a76a8f21d Merge pull request #3259 from BerriAI/litellm_filter_metrics_model_group
UI - filter metrics by model group
2024-04-23 22:22:05 -07:00
Ishaan Jaff
acf08e6d00 backend allow filtering by model_group 2024-04-23 22:03:20 -07:00
Krrish Dholakia
5da7946d43 fix(langsmith.py): fix streaming langsmith calls 2024-04-23 20:47:19 -07:00