Krish Dholakia
3645c89fb5
Merge pull request #3602 from msabramo/msabramo/fix_pkg_resources_warning
...
Fix `pkg_resources` warning
2024-05-13 21:59:52 -07:00
Krrish Dholakia
f8e1b1db2e
refactor(utils.py): trigger local_testing
2024-05-13 18:18:22 -07:00
Krrish Dholakia
ace5ce0b78
fix(utils.py): fix watsonx exception mapping
2024-05-13 18:13:13 -07:00
Krrish Dholakia
bf8d3be791
fix(utils.py): watsonx ai exception mapping fix
2024-05-13 17:11:33 -07:00
Krrish Dholakia
ca641d0a24
fix(utils.py): handle api assistant returning 'null' role
...
Fixes https://github.com/BerriAI/litellm/issues/3621
2024-05-13 16:46:07 -07:00
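The commit above describes defaulting a missing assistant role. A minimal sketch of that idea (names are illustrative, not LiteLLM's actual code): some provider APIs return `role` as `None` or the literal string `"null"` on assistant messages, and normalizing it keeps downstream message handling working.

```python
# Illustrative sketch only: coerce a missing/"null" role to "assistant".
def normalize_role(role):
    if role is None or role == "null":
        return "assistant"
    return role
```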
Krrish Dholakia
8d94665842
fix(utils.py): fix custom pricing when litellm model != response obj model name
2024-05-13 15:25:35 -07:00
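A hypothetical sketch of the lookup problem the commit above fixes (function and key names are assumptions, not LiteLLM's actual code): when custom pricing is keyed by the configured model name, the cost lookup should try that name before the model string the provider echoes back in the response object.

```python
# Illustrative only: resolve custom per-token pricing when the configured
# model name differs from the model name in the response object.
def resolve_custom_pricing(configured_model, response_model, custom_pricing):
    for name in (configured_model, response_model):
        if name in custom_pricing:
            return custom_pricing[name]
    return None
```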
Krrish Dholakia
96336cdd49
fix(openai.py): create MistralConfig with response_format mapping for mistral api
2024-05-13 13:29:58 -07:00
Krrish Dholakia
a907247033
fix(utils.py): fix vertex ai function calling + streaming
...
Completes https://github.com/BerriAI/litellm/issues/3147
2024-05-13 12:32:39 -07:00
Marc Abramowitz
09829c7c78
Fix pkg_resources warning
...
by trying to use `importlib.resources` first and falling back to
`pkg_resources` if that fails.
With this and the changes in GH-3600 and GH-3601, the tests pass with **zero
warnings**!! 🎉 🎉
```shell
abramowi at marcs-mbp-3 in ~/Code/OpenSource/litellm (msabramo/fix-pydantic-warnings●●)
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py
====================================== test session starts ======================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, mock-3.14.0
collected 12 items
litellm/tests/test_proxy_server.py s..........s [100%]
================================= 10 passed, 2 skipped in 9.24s =================================
```
2024-05-12 12:46:24 -07:00
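The fallback described in the commit above can be sketched as follows (the helper name is illustrative; the pattern is the standard one for silencing the `pkg_resources` deprecation warning on modern Pythons while keeping older ones working):

```python
# Prefer importlib.resources (stdlib, no deprecation warning); fall back to
# the deprecated pkg_resources API only when importlib.resources is missing
# the files() API (Python < 3.9).
def read_package_file(package: str, filename: str) -> str:
    try:
        from importlib.resources import files  # Python 3.9+
        return files(package).joinpath(filename).read_text()
    except (ImportError, AttributeError):
        import pkg_resources  # deprecated, emits a UserWarning on import
        return pkg_resources.resource_string(package, filename).decode("utf-8")
```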
Krish Dholakia
784ae85ba0
Merge branch 'main' into litellm_bedrock_command_r_support
2024-05-11 21:24:42 -07:00
Krrish Dholakia
793d6c1dc1
fix(utils.py): correctly exception map 'request too large' as rate limit error
2024-05-11 20:20:34 -07:00
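A hypothetical sketch of the mapping the commit above corrects (exception classes and matching are illustrative, not LiteLLM's actual code): a provider's "request too large" message should surface as a rate-limit error, which callers typically treat as retryable, rather than as a generic bad-request error.

```python
# Illustrative error classes and mapping for demonstration purposes.
class RateLimitError(Exception):
    pass

class BadRequestError(Exception):
    pass

def map_provider_error(message: str) -> type:
    # "request too large" indicates throttling on payload size, so map it
    # to the retryable rate-limit category.
    if "request too large" in message.lower():
        return RateLimitError
    return BadRequestError
```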
Krrish Dholakia
68596ced04
feat(bedrock_httpx.py): working bedrock command-r sync+async streaming
2024-05-11 19:39:51 -07:00
Ishaan Jaff
e2665c4e72
fix - oidc provider on python3.8
2024-05-11 16:01:34 -07:00
Ishaan Jaff
9cc30e32b3
(Fix) - linting errors
2024-05-11 15:57:06 -07:00
Krrish Dholakia
926b86af87
feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls
2024-05-11 13:43:08 -07:00
Krish Dholakia
7f64c61275
Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
...
feat(router.py): allow setting model_region in litellm_params
2024-05-11 11:36:22 -07:00
Krrish Dholakia
691c185ff8
feat(router.py): support region routing for bedrock, vertex ai, watsonx
2024-05-11 11:04:00 -07:00
Krrish Dholakia
2ed155b4d4
feat(router.py): allow setting model_region in litellm_params
...
Closes https://github.com/BerriAI/litellm/issues/3580
2024-05-11 10:18:08 -07:00
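The two commits above (region routing plus `model_region` in `litellm_params`) suggest a filtering step that can be sketched like this (a simplified assumption about the deployment dict shape, not the router's actual implementation): deployments declaring a `model_region` outside the allowed region are dropped from the candidate pool.

```python
# Illustrative sketch: keep only deployments whose litellm_params declare
# a model_region matching the allowed region.
def filter_by_region(deployments, allowed_model_region):
    return [
        d for d in deployments
        if d.get("litellm_params", {}).get("model_region") == allowed_model_region
    ]
```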
Krish Dholakia
4f89f0d3a4
Merge pull request #3561 from simonsanvil/feature/watsonx-integration
...
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00
Krish Dholakia
8ab9c861c9
Merge pull request #3369 from mogith-pn/main
...
Clarifai-LiteLLM: Added clarifai as LLM Provider.
2024-05-11 09:31:46 -07:00
Krish Dholakia
997ef2e480
Merge pull request #3507 from Manouchehri/oidc-3505-part-1
...
Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI)
2024-05-11 09:25:17 -07:00
Ishaan Jaff
b02f633cd6
Merge pull request #3577 from BerriAI/litellm_add_triton_server
...
[Feat] Add Triton Embeddings to LiteLLM
2024-05-10 19:20:23 -07:00
Ishaan Jaff
82344db621
fix triton params
2024-05-10 19:14:48 -07:00
Krish Dholakia
859d978a77
Merge pull request #3571 from BerriAI/litellm_hf_classifier_support
...
Huggingface classifier support
2024-05-10 17:54:27 -07:00
Ishaan Jaff
64c9871583
fix langfuse logger re-initialized on all failure callbacks
2024-05-10 17:48:44 -07:00
Ishaan Jaff
4c0a1d3ec6
fix langfuse failure logging
2024-05-10 17:02:38 -07:00
Ishaan Jaff
472ad0b800
fix - support dynamic failure callbacks
2024-05-10 16:37:01 -07:00
Ishaan Jaff
b0777de041
fix - using failure callbacks with team based logging
2024-05-10 16:18:13 -07:00
Krrish Dholakia
4680f4e1db
test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi
2024-05-10 14:07:01 -07:00
Krrish Dholakia
03139e1769
fix(main.py): support env var 'VERTEX_PROJECT' and 'VERTEX_LOCATION'
2024-05-10 07:57:56 -07:00
Simon Sanchez Viloria
4e267fdaef
Merge branch 'main' into feature/watsonx-integration
2024-05-10 12:09:09 +02:00
Simon Sanchez Viloria
8e61b707c3
(fix) watsonx.py: Fixed linting errors and made sure stream chunks always return usage
2024-05-10 11:53:33 +02:00
Krish Dholakia
ddf09a3193
Merge pull request #3552 from BerriAI/litellm_predibase_support
...
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
Ishaan Jaff
a9aa71de01
Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
...
[Feat] support `stream_options` on `litellm.text_completion`
2024-05-09 18:05:58 -07:00
Krrish Dholakia
7c0ab40fd5
feat(predibase.py): support async_completion + streaming (sync + async)
...
finishes up pr
2024-05-09 17:41:27 -07:00
Krrish Dholakia
f660d21743
feat(predibase.py): add support for predibase provider
...
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
Krrish Dholakia
f10413e373
fix(utils.py): change error log to be debug
2024-05-09 13:58:45 -07:00
Ishaan Jaff
b0bcb74ba5
fix TextCompletionStreamWrapper
2024-05-09 09:54:44 -07:00
Ishaan Jaff
454dbdf285
feat - support stream_options for text completion
2024-05-09 08:42:25 -07:00
Ishaan Jaff
2968737969
Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
...
[Feat] support `stream_options` param for OpenAI
2024-05-09 08:34:08 -07:00
Krrish Dholakia
40b1ee42ed
fix(get_api_base): fix get_api_base to handle model with alias
2024-05-09 08:01:17 -07:00
Krish Dholakia
8af4596dad
Revert "Add support for async streaming to watsonx provider "
2024-05-09 07:44:15 -07:00
Krish Dholakia
64ca2fde53
Merge branch 'main' into litellm_region_based_routing
2024-05-08 22:19:51 -07:00
Krish Dholakia
ffe255ea2b
Merge pull request #3479 from simonsanvil/feature/watsonx-integration
...
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Krrish Dholakia
0ea8222508
feat(router.py): enable filtering model group by 'allowed_model_region'
2024-05-08 22:10:17 -07:00
Ishaan Jaff
8fb55507ad
support stream_options
2024-05-08 21:53:33 -07:00
Ishaan Jaff
fed005d853
Merge pull request #3534 from BerriAI/litellm_fix_cost_calc_bedrock
...
[Fix] `litellm.completion_cost(model="bedrock/anthropic.claude-instant-v1"..)`
2024-05-08 16:59:46 -07:00
Krrish Dholakia
5f93cae3ff
feat(proxy_server.py): return litellm version in response headers
2024-05-08 16:00:08 -07:00
Ishaan Jaff
6d71c1e44b
fix completion cost test
2024-05-08 15:51:30 -07:00
Ishaan Jaff
bbd8770260
fix - cost tracking - looking up bedrock pricing
2024-05-08 15:25:52 -07:00