Krish Dholakia
5b390b6512
Merge pull request #3602 from msabramo/msabramo/fix_pkg_resources_warning
...
Fix `pkg_resources` warning
2024-05-13 21:59:52 -07:00
Krrish Dholakia
155f1f164f
refactor(utils.py): trigger local_testing
2024-05-13 18:18:22 -07:00
Krrish Dholakia
29449aa5c1
fix(utils.py): fix watsonx exception mapping
2024-05-13 18:13:13 -07:00
Krrish Dholakia
d7c28509d7
fix(utils.py): watsonx ai exception mapping fix
2024-05-13 17:11:33 -07:00
Krrish Dholakia
240c9550f0
fix(utils.py): handle api assistant returning 'null' role
...
Fixes https://github.com/BerriAI/litellm/issues/3621
2024-05-13 16:46:07 -07:00
Krrish Dholakia
b4a8665d11
fix(utils.py): fix custom pricing when litellm model != response obj model name
2024-05-13 15:25:35 -07:00
Krrish Dholakia
20456968e9
fix(openai.py): create MistralConfig with response_format mapping for mistral api
2024-05-13 13:29:58 -07:00
Krrish Dholakia
39e4927752
fix(utils.py): fix vertex ai function calling + streaming
...
Completes https://github.com/BerriAI/litellm/issues/3147
2024-05-13 12:32:39 -07:00
Marc Abramowitz
bfaf8d033d
Fix pkg_resources warning
...
by trying to use `importlib.resources` first and falling back to
`pkg_resources` if that fails.
With this and the changes in GH-3600 and GH-3601, the tests pass with **zero
warnings**!! 🎉 🎉
```shell
abramowi at marcs-mbp-3 in ~/Code/OpenSource/litellm (msabramo/fix-pydantic-warnings●●)
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py
====================================== test session starts ======================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, mock-3.14.0
collected 12 items
litellm/tests/test_proxy_server.py s..........s [100%]
================================= 10 passed, 2 skipped in 9.24s =================================
```
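The fallback described in this commit message (try `importlib.resources` first, use `pkg_resources` only if that fails) can be sketched roughly as below. This is an illustrative pattern, not the actual diff; the helper name `read_package_resource` is hypothetical:

```python
# Sketch of the importlib.resources -> pkg_resources fallback (illustrative only).
try:
    # Preferred: importlib.resources.files (stdlib, Python 3.9+) avoids the
    # deprecated pkg_resources API and its DeprecationWarning.
    from importlib.resources import files

    def read_package_resource(package: str, resource: str) -> str:
        # Traversable API: locate the resource inside the package and read it.
        return files(package).joinpath(resource).read_text()

except ImportError:
    # Fallback for older interpreters: pkg_resources from setuptools.
    import pkg_resources

    def read_package_resource(package: str, resource: str) -> str:
        return pkg_resources.resource_string(package, resource).decode("utf-8")
```

Keeping the `pkg_resources` path behind an `ImportError` guard means newer Pythons never import setuptools at all, which is what silences the warning.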
2024-05-12 12:46:24 -07:00
Krish Dholakia
1d651c6049
Merge branch 'main' into litellm_bedrock_command_r_support
2024-05-11 21:24:42 -07:00
Krrish Dholakia
15ba244e46
fix(utils.py): correctly exception map 'request too large' as rate limit error
2024-05-11 20:20:34 -07:00
Krrish Dholakia
64650c0279
feat(bedrock_httpx.py): working bedrock command-r sync+async streaming
2024-05-11 19:39:51 -07:00
Ishaan Jaff
0887e9cc0d
fix - oidc provider on python3.8
2024-05-11 16:01:34 -07:00
Ishaan Jaff
91a6a0eef4
(Fix) - linting errors
2024-05-11 15:57:06 -07:00
Krrish Dholakia
4a3b084961
feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls
2024-05-11 13:43:08 -07:00
Krish Dholakia
86d0c0ae4e
Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
...
feat(router.py): allow setting model_region in litellm_params
2024-05-11 11:36:22 -07:00
Krrish Dholakia
6714854bb7
feat(router.py): support region routing for bedrock, vertex ai, watsonx
2024-05-11 11:04:00 -07:00
Krrish Dholakia
ebc927f1c8
feat(router.py): allow setting model_region in litellm_params
...
Closes https://github.com/BerriAI/litellm/issues/3580
2024-05-11 10:18:08 -07:00
Krish Dholakia
d33e49411d
Merge pull request #3561 from simonsanvil/feature/watsonx-integration
...
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00
Krish Dholakia
8f6ae9a059
Merge pull request #3369 from mogith-pn/main
...
Clarifai-LiteLLM : Added clarifai as LLM Provider.
2024-05-11 09:31:46 -07:00
Krish Dholakia
40063798bd
Merge pull request #3507 from Manouchehri/oidc-3505-part-1
...
Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI)
2024-05-11 09:25:17 -07:00
Ishaan Jaff
b09075da53
Merge pull request #3577 from BerriAI/litellm_add_triton_server
...
[Feat] Add Triton Embeddings to LiteLLM
2024-05-10 19:20:23 -07:00
Ishaan Jaff
ed2c05d10d
fix triton params
2024-05-10 19:14:48 -07:00
Krish Dholakia
1aa567f3b5
Merge pull request #3571 from BerriAI/litellm_hf_classifier_support
...
Huggingface classifier support
2024-05-10 17:54:27 -07:00
Ishaan Jaff
1d25be0ca8
fix langfuse logger re-initialized on all failure callbacks
2024-05-10 17:48:44 -07:00
Ishaan Jaff
ce8523808b
fix langfuse failure logging
2024-05-10 17:02:38 -07:00
Ishaan Jaff
53f9d8280f
fix - support dynamic failure callbacks
2024-05-10 16:37:01 -07:00
Ishaan Jaff
b6e0f00ed8
fix - using failure callbacks with team based logging
2024-05-10 16:18:13 -07:00
Krrish Dholakia
c17f221b89
test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi
2024-05-10 14:07:01 -07:00
Krrish Dholakia
9a31f3d3d9
fix(main.py): support env vars 'VERTEX_PROJECT' and 'VERTEX_LOCATION'
2024-05-10 07:57:56 -07:00
Simon Sanchez Viloria
e1372de9ee
Merge branch 'main' into feature/watsonx-integration
2024-05-10 12:09:09 +02:00
Simon Sanchez Viloria
170fd11c82
(fix) watsonx.py: Fixed linting errors and make sure stream chunk always return usage
2024-05-10 11:53:33 +02:00
Krish Dholakia
a671046b45
Merge pull request #3552 from BerriAI/litellm_predibase_support
...
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
Ishaan Jaff
5eb12e30cc
Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
...
[Feat] support `stream_options` on `litellm.text_completion`
2024-05-09 18:05:58 -07:00
Krrish Dholakia
d7189c21fd
feat(predibase.py): support async_completion + streaming (sync + async)
...
finishes up pr
2024-05-09 17:41:27 -07:00
Krrish Dholakia
186c0ec77b
feat(predibase.py): add support for predibase provider
...
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
Krrish Dholakia
acb615957d
fix(utils.py): change error log to be debug
2024-05-09 13:58:45 -07:00
Ishaan Jaff
6634ea37e9
fix TextCompletionStreamWrapper
2024-05-09 09:54:44 -07:00
Ishaan Jaff
e0b1eff1eb
feat - support stream_options for text completion
2024-05-09 08:42:25 -07:00
Ishaan Jaff
0b1885ca99
Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
...
[Feat] support `stream_options` param for OpenAI
2024-05-09 08:34:08 -07:00
Krrish Dholakia
4cfd988529
fix(get_api_base): fix get_api_base to handle model with alias
2024-05-09 08:01:17 -07:00
Krish Dholakia
8015bc1c47
Revert "Add support for async streaming to watsonx provider "
2024-05-09 07:44:15 -07:00
Krish Dholakia
8ad979cdfe
Merge branch 'main' into litellm_region_based_routing
2024-05-08 22:19:51 -07:00
Krish Dholakia
3f13251241
Merge pull request #3479 from simonsanvil/feature/watsonx-integration
...
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Krrish Dholakia
3d18897d69
feat(router.py): enable filtering model group by 'allowed_model_region'
2024-05-08 22:10:17 -07:00
Ishaan Jaff
80ca011a64
support stream_options
2024-05-08 21:53:33 -07:00
Ishaan Jaff
41a4a06389
Merge pull request #3534 from BerriAI/litellm_fix_cost_calc_bedrock
...
[Fix] `litellm.completion_cost(model="bedrock/anthropic.claude-instant-v1"..)`
2024-05-08 16:59:46 -07:00
Krrish Dholakia
6575143460
feat(proxy_server.py): return litellm version in response headers
2024-05-08 16:00:08 -07:00
Ishaan Jaff
33d6caa889
fix completion cost test
2024-05-08 15:51:30 -07:00
Ishaan Jaff
8348c671a9
fix - cost tracking - looking up bedrock pricing
2024-05-08 15:25:52 -07:00