Commit graph

11363 commits

Author SHA1 Message Date
Krrish Dholakia
714370956f fix(predibase.py): fix async streaming 2024-05-09 22:18:16 -07:00
Krrish Dholakia
76d4290591 fix(predibase.py): fix event loop closed error 2024-05-09 19:07:19 -07:00
Krrish Dholakia
491e177348 fix(predibase.py): fix async completion call 2024-05-09 18:44:19 -07:00
Krrish Dholakia
5a38438c3f docs(customer_routing.md): add region-based routing for specific customers, to docs 2024-05-09 18:40:49 -07:00
Krrish Dholakia
425efc60f4 fix(main.py): fix linting error 2024-05-09 18:12:28 -07:00
Ishaan Jaff
5eb12e30cc Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
[Feat] support `stream_options` on `litellm.text_completion`
2024-05-09 18:05:58 -07:00
Ishaan Jaff
63bfc12a63 Merge pull request #3555 from CyanideByte/global-ignore-warning
Globally filtering pydantic conflict warnings
2024-05-09 18:04:08 -07:00
Krrish Dholakia
9083d8e490 fix: fix linting errors 2024-05-09 17:55:27 -07:00
CyanideByte
4a7be9163b Globally filtering pydantic conflict warnings 2024-05-09 17:42:19 -07:00
Krrish Dholakia
d7189c21fd feat(predibase.py): support async_completion + streaming (sync + async)
finishes up pr
2024-05-09 17:41:27 -07:00
Krish Dholakia
dab176b7e7 Merge pull request #3551 from powerhouseofthecell/fix/error-on-get-user-role
Fix/error on get user role
2024-05-09 17:40:18 -07:00
Krrish Dholakia
186c0ec77b feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
Nick Wong
d3a228d03b added changes from upstream
Merge branch 'main' into fix/error-on-get-user-role
2024-05-09 16:14:14 -07:00
Nick Wong
c42f1ce2c6 removed extra default dict return, which causes an error if user_role is a string 2024-05-09 16:13:26 -07:00
Krrish Dholakia
43b2050cc2 bump: version 1.36.4 → 1.37.0 2024-05-09 15:41:40 -07:00
Krrish Dholakia
5c6a382d3b refactor(main.py): trigger new build 2024-05-09 15:41:33 -07:00
Krrish Dholakia
acb615957d fix(utils.py): change error log to be debug 2024-05-09 13:58:45 -07:00
Krrish Dholakia
c4295e1667 test(test_least_busy_routing.py): avoid deployments with low rate limits 2024-05-09 13:54:24 -07:00
Krrish Dholakia
927d36148f feat(proxy_server.py): expose new /team/list endpoint
Closes https://github.com/BerriAI/litellm/issues/3523
2024-05-09 13:21:00 -07:00
Krrish Dholakia
e3f25a4a1f fix(auth_checks.py): fix 'get_end_user_object'
await cache get
2024-05-09 13:05:56 -07:00
Ishaan Jaff
bf99311f5c fix load test length 2024-05-09 12:50:24 -07:00
Ishaan Jaff
c8662234c7 fix docker run command on release notes 2024-05-09 12:38:38 -07:00
Ishaan Jaff
50b4167a27 fix interpret load test 2024-05-09 12:30:35 -07:00
Ishaan Jaff
30b7e9f776 temp fix load test 2024-05-09 12:28:14 -07:00
Ishaan Jaff
84b2af8e6c fix show docker run on repo 2024-05-09 12:27:44 -07:00
Ishaan Jaff
6634ea37e9 fix TextCompletionStreamWrapper 2024-05-09 09:54:44 -07:00
Ishaan Jaff
e0b1eff1eb feat - support stream_options for text completion 2024-05-09 08:42:25 -07:00
Ishaan Jaff
a29fcc057b test - stream_options on OpenAI text_completion 2024-05-09 08:41:31 -07:00
Ishaan Jaff
66053f14ae stream_options for text-completion openai 2024-05-09 08:37:40 -07:00
Ishaan Jaff
4d5b4a5293 add stream_options to text_completion 2024-05-09 08:35:35 -07:00
Ishaan Jaff
0b1885ca99 Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
[Feat] support `stream_options` param for OpenAI
2024-05-09 08:34:08 -07:00
Krrish Dholakia
4cfd988529 fix(get_api_base): fix get_api_base to handle model with alias 2024-05-09 08:01:17 -07:00
Ishaan Jaff
dfd6361310 fix completion vs acompletion params 2024-05-09 07:59:37 -07:00
Krish Dholakia
4c8787f896 Merge pull request #3546 from BerriAI/revert-3479-feature/watsonx-integration
Revert "Add support for async streaming to watsonx provider "
2024-05-09 07:44:32 -07:00
Krish Dholakia
8015bc1c47 Revert "Add support for async streaming to watsonx provider " 2024-05-09 07:44:15 -07:00
Merlinvt
265d777894 fixes 2 2024-05-09 15:27:14 +02:00
Merlinvt
ccdd2046af fixes 2024-05-09 15:20:32 +02:00
Merlinvt
fc51a3631e add additional models from openrouter 2024-05-09 15:16:34 +02:00
Krish Dholakia
66a1b581e5 Merge pull request #3536 from BerriAI/litellm_region_based_routing
feat(proxy_server.py): add CRUD endpoints for 'end_user' management
2024-05-08 22:23:40 -07:00
Krish Dholakia
8ad979cdfe Merge branch 'main' into litellm_region_based_routing 2024-05-08 22:19:51 -07:00
Krish Dholakia
3f13251241 Merge pull request #3479 from simonsanvil/feature/watsonx-integration
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Krrish Dholakia
3d18897d69 feat(router.py): enable filtering model group by 'allowed_model_region' 2024-05-08 22:10:17 -07:00
Ishaan Jaff
e7e54772ae docs include stream_options param 2024-05-08 21:57:25 -07:00
Ishaan Jaff
80ca011a64 support stream_options 2024-05-08 21:53:33 -07:00
Ishaan Jaff
f2965660dd test openai stream_options 2024-05-08 21:52:39 -07:00
Ishaan Jaff
1042051602 support stream_options for chat completion models 2024-05-08 21:52:25 -07:00
Ishaan Jaff
edb10198ef feat - add stream_options support litellm 2024-05-08 21:25:40 -07:00
Ishaan Jaff
b5db045624 bump: version 1.36.3 → 1.36.4 2024-05-08 19:49:35 -07:00
Krrish Dholakia
db666b01e5 feat(proxy_server.py): add CRUD endpoints for 'end_user' management
allow admin to specify region + default models for end users
2024-05-08 18:50:36 -07:00
Ishaan Jaff
dea4a081c7 ui - new build 2024-05-08 18:45:54 -07:00