Ishaan Jaff | c8662234c7 | fix docker run command on release notes | 2024-05-09 12:38:38 -07:00
Ishaan Jaff | 50b4167a27 | fix interpret load test | 2024-05-09 12:30:35 -07:00
Ishaan Jaff | 30b7e9f776 | temp fix load test | 2024-05-09 12:28:14 -07:00
Ishaan Jaff | 84b2af8e6c | fix show docker run on repo | 2024-05-09 12:27:44 -07:00
Ishaan Jaff | 0b1885ca99 | Merge pull request #3537 from BerriAI/litellm_support_stream_options_param · [Feat] support `stream_options` param for OpenAI | 2024-05-09 08:34:08 -07:00
Krrish Dholakia | 4cfd988529 | fix(get_api_base): fix get_api_base to handle model with alias | 2024-05-09 08:01:17 -07:00
Ishaan Jaff | dfd6361310 | fix completion vs acompletion params | 2024-05-09 07:59:37 -07:00
Krish Dholakia | 4c8787f896 | Merge pull request #3546 from BerriAI/revert-3479-feature/watsonx-integration · Revert "Add support for async streaming to watsonx provider" | 2024-05-09 07:44:32 -07:00
Krish Dholakia | 8015bc1c47 | Revert "Add support for async streaming to watsonx provider" | 2024-05-09 07:44:15 -07:00
Krish Dholakia | 66a1b581e5 | Merge pull request #3536 from BerriAI/litellm_region_based_routing · feat(proxy_server.py): add CRUD endpoints for 'end_user' management | 2024-05-08 22:23:40 -07:00
Krish Dholakia | 8ad979cdfe | Merge branch 'main' into litellm_region_based_routing | 2024-05-08 22:19:51 -07:00
Krish Dholakia | 3f13251241 | Merge pull request #3479 from simonsanvil/feature/watsonx-integration · Add support for async streaming to watsonx provider | 2024-05-08 22:19:05 -07:00
Krrish Dholakia | 3d18897d69 | feat(router.py): enable filtering model group by 'allowed_model_region' | 2024-05-08 22:10:17 -07:00
Ishaan Jaff | e7e54772ae | docs include stream_options param | 2024-05-08 21:57:25 -07:00
Ishaan Jaff | 80ca011a64 | support stream_options | 2024-05-08 21:53:33 -07:00
Ishaan Jaff | f2965660dd | test openai stream_options | 2024-05-08 21:52:39 -07:00
Ishaan Jaff | 1042051602 | support stream_options for chat completion models | 2024-05-08 21:52:25 -07:00
Ishaan Jaff | edb10198ef | feat - add stream_options support litellm | 2024-05-08 21:25:40 -07:00
Ishaan Jaff | b5db045624 | bump: version 1.36.3 → 1.36.4 | 2024-05-08 19:49:35 -07:00
Krrish Dholakia | db666b01e5 | feat(proxy_server.py): add CRUD endpoints for 'end_user' management · allow admin to specify region + default models for end users | 2024-05-08 18:50:36 -07:00
Ishaan Jaff | dea4a081c7 | ui - new build | 2024-05-08 18:45:54 -07:00
Ishaan Jaff | b15c1c907a | Merge pull request #3530 from BerriAI/ui_show_spend_end_user · [UI] show `End-User` Usage on Usage Tab | 2024-05-08 18:36:51 -07:00
Ishaan Jaff | 6d955ef457 | Merge branch 'main' into ui_show_spend_end_user | 2024-05-08 18:29:25 -07:00
Ishaan Jaff | bcba92e092 | fix bug filtering keys on usage tab | 2024-05-08 18:25:28 -07:00
Ishaan Jaff | a38d9e35fd | feat - get price by end_user | 2024-05-08 18:19:27 -07:00
Ishaan Jaff | 0260addeac | clean up usage tab | 2024-05-08 18:06:31 -07:00
Ishaan Jaff | f5f5bebe49 | clean up users page | 2024-05-08 17:58:10 -07:00
Ishaan Jaff | 644738dafb | usage - filter by keys | 2024-05-08 17:57:28 -07:00
Ishaan Jaff | 081e8732dc | ui - cleanup view users tab | 2024-05-08 17:34:44 -07:00
Ishaan Jaff | 08d37e1b0e | ui - filter usage by day | 2024-05-08 17:34:30 -07:00
Ishaan Jaff | c7037c20ea | fix pass startTime and endTime on Users Usage tab | 2024-05-08 17:26:46 -07:00
Ishaan Jaff | b99a6717af | fix - startTime, endTime in GlobalEndUsersSpend | 2024-05-08 17:05:09 -07:00
Ishaan Jaff | 1eea4d1c90 | fix /global/spend/end_users | 2024-05-08 17:03:38 -07:00
Ishaan Jaff | faab704d28 | update global/spend/end_users | 2024-05-08 17:03:09 -07:00
Ishaan Jaff | 41a4a06389 | Merge pull request #3534 from BerriAI/litellm_fix_cost_calc_bedrock · [Fix] `litellm.completion_cost(model="bedrock/anthropic.claude-instant-v1"..)` | 2024-05-08 16:59:46 -07:00
Krish Dholakia | ce7e64609f | Merge pull request #3535 from BerriAI/litellm_version_response_headers · feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:34:44 -07:00
Krrish Dholakia | 6575143460 | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00
Ishaan Jaff | 33d6caa889 | fix completion cost test | 2024-05-08 15:51:30 -07:00
Ishaan Jaff | 282b8d0ae4 | test bedrock pricing | 2024-05-08 15:26:53 -07:00
Ishaan Jaff | 8348c671a9 | fix - cost tracking - looking up bedrock pricing | 2024-05-08 15:25:52 -07:00
Krrish Dholakia | 80378966a0 | build: add azure resource template | 2024-05-08 15:24:58 -07:00
Krish Dholakia | 91bb7cd261 | Merge pull request #3437 from msabramo/add-engines-model-chat-completions-endpoint · Add `/engines/{model}/chat/completions` endpoint | 2024-05-08 14:30:39 -07:00
Ishaan Jaff | ba08a82885 | Merge pull request #3532 from BerriAI/litellm_send_alert_on_cooling_down_deploymeny · [Feat] send alert on cooling down deployment | 2024-05-08 14:30:31 -07:00
Ishaan Jaff | 597b09598c | feat - send alert on cooling down a deployment | 2024-05-08 14:14:14 -07:00
Ishaan Jaff | aef3d89f0c | fix add cooldown_deployment alert_type | 2024-05-08 14:13:51 -07:00
Krrish Dholakia | 51b6c3bdbc | test(test_function_call_parsing.py): add test for function call parsing · Closes https://github.com/BerriAI/litellm/issues/2654 | 2024-05-08 10:54:26 -07:00
Ishaan Jaff | c60f12a70b | ui - show guardrails | 2024-05-08 10:51:34 -07:00
Ishaan Jaff | 7bfd02350a | stash - users usage tab | 2024-05-08 10:38:27 -07:00
Krrish Dholakia | c5897543c8 | docs(hosted.md): add feature list | 2024-05-08 09:53:13 -07:00
Ishaan Jaff | cae390b51a | ui - clean up usage tab | 2024-05-08 09:51:58 -07:00