Krish Dholakia | 86d0c0ae4e | 2024-05-11 11:36:22 -07:00
    Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
    feat(router.py): allow setting model_region in litellm_params

Krrish Dholakia | 6714854bb7 | 2024-05-11 11:04:00 -07:00
    feat(router.py): support region routing for bedrock, vertex ai, watsonx

Krrish Dholakia | ebc927f1c8 | 2024-05-11 10:18:08 -07:00
    feat(router.py): allow setting model_region in litellm_params
    Closes https://github.com/BerriAI/litellm/issues/3580
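The region-routing commits above ("allow setting model_region in litellm_params", and 3d18897d69's "filtering model group by 'allowed_model_region'") can be sketched roughly as follows. This is a minimal illustration inferred only from the commit messages — the deployment dicts and the `filter_deployments` helper are assumptions, not LiteLLM's actual router code:

```python
# Hypothetical sketch of region-based deployment filtering,
# inferred from the commit messages above. Not LiteLLM's real implementation.

model_list = [
    {
        "model_name": "claude-v1",
        "litellm_params": {
            "model": "bedrock/anthropic.claude-instant-v1",
            "model_region": "eu",  # explicit region, per PR #3582
        },
    },
    {
        "model_name": "claude-v1",
        "litellm_params": {
            "model": "bedrock/anthropic.claude-instant-v1",
            "model_region": "us",
        },
    },
]

def filter_deployments(deployments, allowed_model_region=None):
    """Keep only deployments whose model_region matches the allowed region."""
    if allowed_model_region is None:
        return deployments
    return [
        d for d in deployments
        if d["litellm_params"].get("model_region") == allowed_model_region
    ]

eu_only = filter_deployments(model_list, allowed_model_region="eu")
print([d["litellm_params"]["model_region"] for d in eu_only])  # ['eu']
```

With no `allowed_model_region` set, all deployments in the group remain candidates.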
Krish Dholakia | d33e49411d | 2024-05-11 09:56:02 -07:00
    Merge pull request #3561 from simonsanvil/feature/watsonx-integration
    (fix) Fixed linting and other bugs with watsonx provider

Krish Dholakia | 8f6ae9a059 | 2024-05-11 09:31:46 -07:00
    Merge pull request #3369 from mogith-pn/main
    Clarifai-LiteLLM: Added clarifai as LLM Provider.

Krish Dholakia | 40063798bd | 2024-05-11 09:25:17 -07:00
    Merge pull request #3507 from Manouchehri/oidc-3505-part-1
    Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI)

Ishaan Jaff | b09075da53 | 2024-05-10 19:20:23 -07:00
    Merge pull request #3577 from BerriAI/litellm_add_triton_server
    [Feat] Add Triton Embeddings to LiteLLM

Ishaan Jaff | ed2c05d10d | 2024-05-10 19:14:48 -07:00
    fix triton params

Krish Dholakia | 1aa567f3b5 | 2024-05-10 17:54:27 -07:00
    Merge pull request #3571 from BerriAI/litellm_hf_classifier_support
    Huggingface classifier support

Ishaan Jaff | 1d25be0ca8 | 2024-05-10 17:48:44 -07:00
    fix langfuse logger re-initialized on all failure callbacks

Ishaan Jaff | ce8523808b | 2024-05-10 17:02:38 -07:00
    fix langfuse failure logging

Ishaan Jaff | 53f9d8280f | 2024-05-10 16:37:01 -07:00
    fix - support dynamic failure callbacks

Ishaan Jaff | b6e0f00ed8 | 2024-05-10 16:18:13 -07:00
    fix - using failure callbacks with team based logging

Krrish Dholakia | c17f221b89 | 2024-05-10 14:07:01 -07:00
    test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi

Krrish Dholakia | 9a31f3d3d9 | 2024-05-10 07:57:56 -07:00
    fix(main.py): support env var 'VERTEX_PROJECT' and 'VERTEX_LOCATION'
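Commit 9a31f3d3d9 adds environment-variable fallbacks for the Vertex AI project and location. The usual shape of that pattern is sketched below; the `get_vertex_settings` helper is illustrative, not LiteLLM's actual function:

```python
import os

def get_vertex_settings(vertex_project=None, vertex_location=None):
    """Illustrative sketch: fall back to the VERTEX_PROJECT / VERTEX_LOCATION
    env vars when values are not passed explicitly (per commit 9a31f3d3d9)."""
    project = vertex_project or os.environ.get("VERTEX_PROJECT")
    location = vertex_location or os.environ.get("VERTEX_LOCATION")
    return project, location

os.environ["VERTEX_PROJECT"] = "my-gcp-project"
os.environ["VERTEX_LOCATION"] = "us-central1"
print(get_vertex_settings())                  # ('my-gcp-project', 'us-central1')
print(get_vertex_settings("override", None))  # ('override', 'us-central1')
```

An explicitly passed value always wins over the environment, so existing callers are unaffected.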
Simon Sanchez Viloria | e1372de9ee | 2024-05-10 12:09:09 +02:00
    Merge branch 'main' into feature/watsonx-integration

Simon Sanchez Viloria | 170fd11c82 | 2024-05-10 11:53:33 +02:00
    (fix) watsonx.py: Fixed linting errors and make sure stream chunk always returns usage

Krish Dholakia | a671046b45 | 2024-05-09 22:21:16 -07:00
    Merge pull request #3552 from BerriAI/litellm_predibase_support
    feat(predibase.py): add support for predibase provider

Ishaan Jaff | 5eb12e30cc | 2024-05-09 18:05:58 -07:00
    Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
    [Feat] support `stream_options` on `litellm.text_completion`

Krrish Dholakia | d7189c21fd | 2024-05-09 17:41:27 -07:00
    feat(predibase.py): support async_completion + streaming (sync + async)
    finishes up pr

Krrish Dholakia | 186c0ec77b | 2024-05-09 16:39:43 -07:00
    feat(predibase.py): add support for predibase provider
    Closes https://github.com/BerriAI/litellm/issues/1253

Krrish Dholakia | acb615957d | 2024-05-09 13:58:45 -07:00
    fix(utils.py): change error log to be debug

Ishaan Jaff | 6634ea37e9 | 2024-05-09 09:54:44 -07:00
    fix TextCompletionStreamWrapper

Ishaan Jaff | e0b1eff1eb | 2024-05-09 08:42:25 -07:00
    feat - support stream_options for text completion

Ishaan Jaff | 0b1885ca99 | 2024-05-09 08:34:08 -07:00
    Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
    [Feat] support `stream_options` param for OpenAI
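PRs #3537 and #3547 wire OpenAI's `stream_options` parameter through `litellm.completion` and `litellm.text_completion`. With `stream_options={"include_usage": True}`, OpenAI appends one final chunk whose `usage` field is populated and whose `choices` list is empty. A toy consumer of such a stream — the chunk dicts below are mocked for illustration, not real API output:

```python
# Toy illustration of consuming a stream created with
# stream_options={"include_usage": True}: the final chunk carries
# usage stats and an empty choices list. Chunks are mocked here.

def mock_stream():
    yield {"choices": [{"delta": {"content": "Hello"}}], "usage": None}
    yield {"choices": [{"delta": {"content": " world"}}], "usage": None}
    # final usage-only chunk, as sent when include_usage=True
    yield {"choices": [], "usage": {"prompt_tokens": 5,
                                    "completion_tokens": 2,
                                    "total_tokens": 7}}

text, usage = "", None
for chunk in mock_stream():
    for choice in chunk["choices"]:
        text += choice["delta"].get("content", "")
    if chunk["usage"] is not None:
        usage = chunk["usage"]

print(text)                   # Hello world
print(usage["total_tokens"])  # 7
```

Consumers must tolerate the empty `choices` in that last chunk, which is why the inner loop guards naturally against it.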
Krrish Dholakia | 4cfd988529 | 2024-05-09 08:01:17 -07:00
    fix(get_api_base): fix get_api_base to handle model with alias

Krish Dholakia | 8015bc1c47 | 2024-05-09 07:44:15 -07:00
    Revert "Add support for async streaming to watsonx provider"

Krish Dholakia | 8ad979cdfe | 2024-05-08 22:19:51 -07:00
    Merge branch 'main' into litellm_region_based_routing

Krish Dholakia | 3f13251241 | 2024-05-08 22:19:05 -07:00
    Merge pull request #3479 from simonsanvil/feature/watsonx-integration
    Add support for async streaming to watsonx provider

Krrish Dholakia | 3d18897d69 | 2024-05-08 22:10:17 -07:00
    feat(router.py): enable filtering model group by 'allowed_model_region'

Ishaan Jaff | 80ca011a64 | 2024-05-08 21:53:33 -07:00
    support stream_options

Ishaan Jaff | 41a4a06389 | 2024-05-08 16:59:46 -07:00
    Merge pull request #3534 from BerriAI/litellm_fix_cost_calc_bedrock
    [Fix] `litellm.completion_cost(model="bedrock/anthropic.claude-instant-v1"..)`
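PR #3534 fixes `litellm.completion_cost` for provider-prefixed Bedrock model names. A common pattern for such lookups is to try the fully qualified name first, then retry with the provider prefix stripped. A sketch with a made-up price table — the numbers and the `lookup_model_cost` helper are illustrative, not LiteLLM's actual cost map:

```python
# Illustrative pricing lookup for provider-prefixed names like
# "bedrock/anthropic.claude-instant-v1" (per PR #3534).
# Prices below are invented for the example only.

PRICES = {
    "anthropic.claude-instant-v1": {
        "input_cost_per_token": 8e-07,
        "output_cost_per_token": 2.4e-06,
    },
}

def lookup_model_cost(model: str) -> dict:
    """Try the exact name, then retry with the provider prefix stripped."""
    if model in PRICES:
        return PRICES[model]
    if "/" in model:  # strip a "bedrock/"-style provider prefix
        _, _, stripped = model.partition("/")
        if stripped in PRICES:
            return PRICES[stripped]
    raise KeyError(f"no pricing found for {model!r}")

cost = lookup_model_cost("bedrock/anthropic.claude-instant-v1")
print(cost["input_cost_per_token"])  # 8e-07
```

The fallback keeps both `anthropic.claude-instant-v1` and its `bedrock/`-prefixed alias resolving to one pricing entry.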
Krrish Dholakia | 6575143460 | 2024-05-08 16:00:08 -07:00
    feat(proxy_server.py): return litellm version in response headers

Ishaan Jaff | 33d6caa889 | 2024-05-08 15:51:30 -07:00
    fix completion cost test

Ishaan Jaff | 8348c671a9 | 2024-05-08 15:25:52 -07:00
    fix - cost tracking - looking up bedrock pricing

Ishaan Jaff | 4e7b5aa9d7 | 2024-05-07 19:31:19 -07:00
    Merge pull request #3439 from phact/patch-3
    add_function_to_prompt bug fix

David Manouchehri | 44b1b21911 | 2024-05-07 21:24:55 +00:00
    feat(utils.py) - Add OIDC caching for Google Cloud Run and GitHub Actions.
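Commit 44b1b21911 caches OIDC tokens so that each request does not re-fetch one from the identity provider. A generic TTL-cache sketch of that idea — the `fetch_oidc_token` stand-in and the five-minute TTL are assumptions for illustration, not the commit's actual values:

```python
import time

_token_cache: dict = {}      # audience -> (token, fetched_at)
TOKEN_TTL_SECONDS = 300      # assumed TTL, for illustration only

def fetch_oidc_token(audience: str) -> str:
    """Stand-in for hitting the provider's identity endpoint."""
    return f"token-for-{audience}-{time.time()}"

def get_oidc_token(audience: str) -> str:
    """Return a cached token while it is fresh; otherwise fetch and cache."""
    cached = _token_cache.get(audience)
    if cached is not None:
        token, fetched_at = cached
        if time.time() - fetched_at < TOKEN_TTL_SECONDS:
            return token
    token = fetch_oidc_token(audience)
    _token_cache[audience] = (token, time.time())
    return token

a = get_oidc_token("bedrock")
b = get_oidc_token("bedrock")  # served from cache, identical token
print(a == b)  # True
```

Keying the cache by audience matters because Bedrock and Azure OpenAI each need tokens minted for their own audience.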
phact | 4c64e3da10 | 2024-05-07 14:58:35 -04:00
    locals().copy()

Paul Gauthier | 90eb0ea022 | 2024-05-07 11:44:03 -07:00
    Added support for the deepseek api

phact | 7c5c9a8152 | 2024-05-07 13:41:05 -04:00
    looks like cohere does support function calling

phact | 1b811cd152 | 2024-05-07 13:24:28 -04:00
    unit test and list fix

David Manouchehri | 4b655d8b33 | 2024-05-07 15:46:48 +00:00
    feat(util.py): Add OIDC support.

Krish Dholakia | 30003afbf8 | 2024-05-06 21:56:29 -07:00
    Merge pull request #3459 from alexanderepstein/langfuse_improvements
    Update support for langfuse metadata

Krish Dholakia | aa62d891a0 | 2024-05-06 19:31:20 -07:00
    Merge branch 'main' into litellm_slack_daily_reports

Krrish Dholakia | 718f423d7d | 2024-05-06 17:18:42 -07:00
    feat(slack_alerting.py): support sending daily reports on deployments
    allow admin to easily know slow + failing deployments
    Closes https://github.com/BerriAI/litellm/issues/3483

Ishaan Jaff | 1de50d62f7 | 2024-05-06 16:38:16 -07:00
    fix test router debug logs

Ishaan Jaff | 6ff37aabb0 | 2024-05-06 14:29:04 -07:00
    fix add key name + team name in alerting messages

Krrish Dholakia | 4b5cf26c1b | 2024-05-06 10:59:53 -07:00
    fix(utils.py): handle gemini chunk no parts error
    Fixes https://github.com/BerriAI/litellm/issues/3468
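Commit 4b5cf26c1b guards against Gemini stream chunks that arrive without a `parts` field (issue #3468). The defensive-access pattern looks roughly like this; the chunk shape is mocked for illustration and the `extract_text` helper is not LiteLLM's actual code:

```python
def extract_text(chunk: dict) -> str:
    """Safely pull text out of a Gemini-style chunk. A candidate's content
    may have no 'parts' at all (issue #3468), so default to an empty list."""
    text = ""
    for candidate in chunk.get("candidates", []):
        content = candidate.get("content") or {}
        for part in content.get("parts", []):  # missing 'parts' -> skip
            text += part.get("text", "")
    return text

good = {"candidates": [{"content": {"parts": [{"text": "hi"}]}}]}
empty = {"candidates": [{"content": {}}]}  # no 'parts' key: previously crashed
print(extract_text(good))   # hi
print(extract_text(empty))  # (empty string, no exception)
```

Defaulting every nested lookup keeps a single malformed chunk from killing the whole stream.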
Simon Sanchez Viloria | 6181d1eaad | 2024-05-06 17:27:14 +02:00
    Merge branch 'main' into feature/watsonx-integration

Simon Sanchez Viloria | 83a274b54b | 2024-05-06 17:08:40 +02:00
    (feat) support for async stream to watsonx provider