Commit graph (11603 commits)
Author SHA1 Message Date
Krrish Dholakia
5c6a382d3b refactor(main.py): trigger new build 2024-05-09 15:41:33 -07:00
Krrish Dholakia
acb615957d fix(utils.py): change error log to be debug 2024-05-09 13:58:45 -07:00
Krrish Dholakia
c4295e1667 test(test_least_busy_routing.py): avoid deployments with low rate limits 2024-05-09 13:54:24 -07:00
Krrish Dholakia
927d36148f feat(proxy_server.py): expose new /team/list endpoint (Closes https://github.com/BerriAI/litellm/issues/3523) 2024-05-09 13:21:00 -07:00
Krrish Dholakia
e3f25a4a1f fix(auth_checks.py): fix 'get_end_user_object' - await cache get 2024-05-09 13:05:56 -07:00
Ishaan Jaff
bf99311f5c fix load test length 2024-05-09 12:50:24 -07:00
Ishaan Jaff
c8662234c7 fix docker run command on release notes 2024-05-09 12:38:38 -07:00
Ishaan Jaff
50b4167a27 fix interpret load test 2024-05-09 12:30:35 -07:00
Ishaan Jaff
30b7e9f776 temp fix load test 2024-05-09 12:28:14 -07:00
Ishaan Jaff
84b2af8e6c fix show docker run on repo 2024-05-09 12:27:44 -07:00
Ishaan Jaff
6634ea37e9 fix TextCompletionStreamWrapper 2024-05-09 09:54:44 -07:00
Ishaan Jaff
e0b1eff1eb feat - support stream_options for text completion 2024-05-09 08:42:25 -07:00
Ishaan Jaff
a29fcc057b test - stream_options on OpenAI text_completion 2024-05-09 08:41:31 -07:00
Ishaan Jaff
66053f14ae stream_options for text-completion openai 2024-05-09 08:37:40 -07:00
Ishaan Jaff
4d5b4a5293 add stream_options to text_completion 2024-05-09 08:35:35 -07:00
Ishaan Jaff
0b1885ca99 Merge pull request #3537 from BerriAI/litellm_support_stream_options_param ([Feat] support `stream_options` param for OpenAI) 2024-05-09 08:34:08 -07:00
Krrish Dholakia
4cfd988529 fix(get_api_base): fix get_api_base to handle model with alias 2024-05-09 08:01:17 -07:00
Ishaan Jaff
dfd6361310 fix completion vs acompletion params 2024-05-09 07:59:37 -07:00
Krish Dholakia
4c8787f896 Merge pull request #3546 from BerriAI/revert-3479-feature/watsonx-integration (Revert "Add support for async streaming to watsonx provider") 2024-05-09 07:44:32 -07:00
Krish Dholakia
8015bc1c47 Revert "Add support for async streaming to watsonx provider" 2024-05-09 07:44:15 -07:00
Merlinvt
265d777894 fixes 2 2024-05-09 15:27:14 +02:00
Merlinvt
ccdd2046af fixes 2024-05-09 15:20:32 +02:00
Merlinvt
fc51a3631e add additional models from openrouter 2024-05-09 15:16:34 +02:00
Krish Dholakia
66a1b581e5 Merge pull request #3536 from BerriAI/litellm_region_based_routing (feat(proxy_server.py): add CRUD endpoints for 'end_user' management) 2024-05-08 22:23:40 -07:00
Krish Dholakia
8ad979cdfe Merge branch 'main' into litellm_region_based_routing 2024-05-08 22:19:51 -07:00
Krish Dholakia
3f13251241 Merge pull request #3479 from simonsanvil/feature/watsonx-integration (Add support for async streaming to watsonx provider) 2024-05-08 22:19:05 -07:00
Krrish Dholakia
3d18897d69 feat(router.py): enable filtering model group by 'allowed_model_region' 2024-05-08 22:10:17 -07:00
Ishaan Jaff
e7e54772ae docs include stream_options param 2024-05-08 21:57:25 -07:00
Ishaan Jaff
80ca011a64 support stream_options 2024-05-08 21:53:33 -07:00
Ishaan Jaff
f2965660dd test openai stream_options 2024-05-08 21:52:39 -07:00
Ishaan Jaff
1042051602 support stream_options for chat completion models 2024-05-08 21:52:25 -07:00
Ishaan Jaff
edb10198ef feat - add stream_options support litellm 2024-05-08 21:25:40 -07:00
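The run of commits above adds `stream_options` support across litellm's chat and text completion paths. As a minimal toy sketch (not litellm code, and the chunk shapes are assumptions based on the OpenAI streaming convention), the contract being implemented is: when `stream_options={"include_usage": True}` is passed on a streaming request, one extra final chunk is emitted whose only payload is the token-usage summary.

```python
# Toy sketch of the stream_options contract: content chunks carry deltas
# with usage=None; with include_usage, a final usage-only chunk follows.
def stream_chunks(tokens, stream_options=None):
    for tok in tokens:
        yield {"choices": [{"delta": {"content": tok}}], "usage": None}
    if stream_options and stream_options.get("include_usage"):
        # Final chunk: no choices, just the usage summary.
        yield {"choices": [], "usage": {"completion_tokens": len(tokens)}}

chunks = list(stream_chunks(["Hel", "lo"], stream_options={"include_usage": True}))
text = "".join(c["choices"][0]["delta"]["content"] for c in chunks if c["choices"])
print(text)                 # Hello
print(chunks[-1]["usage"])  # {'completion_tokens': 2}
```

Callers that ignore the extra chunk keep working, since it contains no choices; callers that want usage read it off the last chunk.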
Ishaan Jaff
b5db045624 bump: version 1.36.3 → 1.36.4 2024-05-08 19:49:35 -07:00
Krrish Dholakia
db666b01e5 feat(proxy_server.py): add CRUD endpoints for 'end_user' management - allow admin to specify region + default models for end users 2024-05-08 18:50:36 -07:00
Ishaan Jaff
dea4a081c7 ui - new build 2024-05-08 18:45:54 -07:00
Ishaan Jaff
b15c1c907a Merge pull request #3530 from BerriAI/ui_show_spend_end_user ([UI] show `End-User` Usage on Usage Tab) 2024-05-08 18:36:51 -07:00
Ishaan Jaff
6d955ef457 Merge branch 'main' into ui_show_spend_end_user 2024-05-08 18:29:25 -07:00
Ishaan Jaff
bcba92e092 fix bug filtering keys on usage tab 2024-05-08 18:25:28 -07:00
Ishaan Jaff
a38d9e35fd feat - get price by end_user 2024-05-08 18:19:27 -07:00
Ishaan Jaff
0260addeac clean up usage tab 2024-05-08 18:06:31 -07:00
Ishaan Jaff
f5f5bebe49 clean up users page 2024-05-08 17:58:10 -07:00
Ishaan Jaff
644738dafb usage - filter by keys 2024-05-08 17:57:28 -07:00
Ishaan Jaff
081e8732dc ui - cleanup view users tab 2024-05-08 17:34:44 -07:00
Ishaan Jaff
08d37e1b0e ui - filter usage by day 2024-05-08 17:34:30 -07:00
Ishaan Jaff
c7037c20ea fix pass startTime and endTime on Users Usage tab 2024-05-08 17:26:46 -07:00
Ishaan Jaff
b99a6717af fix - startTime, endTime in GlobalEndUsersSpend 2024-05-08 17:05:09 -07:00
Ishaan Jaff
1eea4d1c90 fix /global/spend/end_users 2024-05-08 17:03:38 -07:00
Ishaan Jaff
faab704d28 update global/spend/end_users 2024-05-08 17:03:09 -07:00
Ishaan Jaff
41a4a06389 Merge pull request #3534 from BerriAI/litellm_fix_cost_calc_bedrock ([Fix] `litellm.completion_cost(model="bedrock/anthropic.claude-instant-v1"..)`) 2024-05-08 16:59:46 -07:00
Krish Dholakia
ce7e64609f Merge pull request #3535 from BerriAI/litellm_version_response_headers (feat(proxy_server.py): return litellm version in response headers) 2024-05-08 16:34:44 -07:00