Commit graph

11876 commits

Author SHA1 Message Date
Ishaan Jaff
d3371fc81d fix langfuse logging metadata 2024-05-11 20:39:44 -07:00
Ishaan Jaff
2b3414c667 ci/cd run again 2024-05-11 20:34:55 -07:00
Krrish Dholakia
15ba244e46 fix(utils.py): correctly exception map 'request too large' as rate limit error 2024-05-11 20:20:34 -07:00
Krrish Dholakia
a456f6bf2b fix(anthropic.py): fix tool calling + streaming issue 2024-05-11 20:15:36 -07:00
Krrish Dholakia
83beb41096 fix(anthropic_text.py): fix linting error 2024-05-11 20:01:50 -07:00
Ishaan Jaff
beac60ed12 test - router retry policy 2024-05-11 19:58:17 -07:00
Krrish Dholakia
65d0be85fc fix(bedrock_httpx.py): compatibility fix 2024-05-11 19:55:38 -07:00
Krrish Dholakia
f6c84f1aa6 fix(anthropic.py): compatibility fix 2024-05-11 19:51:29 -07:00
Krrish Dholakia
6d67d6d5ad fix(types/bedrock.py): linting fix 2024-05-11 19:49:46 -07:00
Ishaan Jaff
61a3e5d5a9 fix get healthy deployments 2024-05-11 19:46:35 -07:00
Krrish Dholakia
ae0c061b46 fix(anthropic.py): fix version compatibility 2024-05-11 19:46:26 -07:00
Krrish Dholakia
b1448cd244 test(test_streaming.py): fix test 2024-05-11 19:44:47 -07:00
Krrish Dholakia
2f3fd3e2f0 fix(anthropic.py): fix linting error 2024-05-11 19:42:14 -07:00
Krrish Dholakia
64650c0279 feat(bedrock_httpx.py): working bedrock command-r sync+async streaming 2024-05-11 19:39:51 -07:00
Ishaan Jaff
04ac352407 test fix - test_async_fallbacks_embeddings 2024-05-11 19:20:24 -07:00
Ishaan Jaff
7930653872 fix - test router fallbacks 2024-05-11 19:13:22 -07:00
Ishaan Jaff
32e445c59d fix - unit tests for router retries 2024-05-11 19:10:33 -07:00
Ishaan Jaff
e0d1f96544 test router - fallbacks 2024-05-11 19:08:31 -07:00
Ishaan Jaff
4d648a6d89 fix - _time_to_sleep_before_retry 2024-05-11 19:08:10 -07:00
Ishaan Jaff
bfcb640d21
Merge pull request #3590 from BerriAI/litellm_router_retry_logic
[Feat] Proxy + Router - Retry on RateLimitErrors when fallbacks, other deployments exists
2024-05-11 18:21:12 -07:00
Ishaan Jaff
c56b44f779 fix failing azure content safety errors 2024-05-11 18:19:00 -07:00
Ishaan Jaff
a978326c99 unify sync and async logic for retries 2024-05-11 18:17:04 -07:00
Ishaan Jaff
4e844d7438 test - unit tests for time to sleep when there are rate limit errors 2024-05-11 18:13:28 -07:00
Ishaan Jaff
6e39760779 fix _time_to_sleep_before_retry 2024-05-11 18:05:12 -07:00
Ishaan Jaff
3e6097d9f8 fix _time_to_sleep_before_retry logic 2024-05-11 18:00:02 -07:00
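The cluster of commits above iterates on the router's `_time_to_sleep_before_retry` helper for rate-limit handling. The actual litellm implementation isn't shown in this log; as a rough illustration only, such a helper typically honors a server-supplied `Retry-After` hint when present and otherwise falls back to capped exponential backoff with jitter (all names and defaults below are hypothetical):

```python
import random


def time_to_sleep_before_retry(attempt, retry_after=None, base=1.0, max_sleep=60.0):
    """Hypothetical sketch: choose how long to sleep before retrying a
    rate-limited call. Prefer the server's Retry-After hint when given;
    otherwise use exponential backoff with jitter, capped at max_sleep."""
    if retry_after is not None:
        return min(float(retry_after), max_sleep)
    backoff = base * (2 ** attempt)  # 1s, 2s, 4s, ... for attempts 0, 1, 2, ...
    return min(backoff, max_sleep) * random.uniform(0.5, 1.0)
```

The unit tests referenced in the commits above would pin down exactly this kind of behavior (e.g. that the cap is respected and the hint wins over the computed backoff).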
Ishaan Jaff
2eb4508204 fix mark (BETA) Azure Content Safety 2024-05-11 17:51:21 -07:00
Ishaan Jaff
754e10f3a4 fix - azure content safety testing does not work 2024-05-11 17:50:27 -07:00
Ishaan Jaff
fa28e69c35 fix test azure_content_safety 2024-05-11 17:48:05 -07:00
Ishaan Jaff
7a6df1a0ab fix - failing_AzureContentSafety tests 2024-05-11 17:39:06 -07:00
Ishaan Jaff
ed8a25c630 tests - unit test router retry logic 2024-05-11 17:31:01 -07:00
Ishaan Jaff
104fd4d048 router - clean up should_retry_this_error 2024-05-11 17:30:21 -07:00
Ishaan Jaff
9c4f1ec3e5 fix - failing test_end_user_specific_region test 2024-05-11 17:05:37 -07:00
Ishaan Jaff
18c2da213a retry logic on router 2024-05-11 17:04:19 -07:00
Marc Abramowitz
b1bf49f0a1 Make test_load_router_config pass
by mocking the necessary things in the test.

Now all the tests in `test_proxy_server.py` pass! 🎉

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py --disable-warnings
====================================== test session starts ======================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, asyncio-0.23.6, mock-3.14.0
asyncio: mode=Mode.STRICT
collected 12 items

litellm/tests/test_proxy_server.py s..........s                                           [100%]

========================== 10 passed, 2 skipped, 48 warnings in 10.70s ==========================
```
2024-05-11 16:55:57 -07:00
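The commit above fixes `test_load_router_config` "by mocking the necessary things in the test." The specific mocks used in litellm aren't shown here; as a generic sketch of the pattern, `unittest.mock.patch` can stand in for a function that would otherwise touch real config or environment state (the `load_router_config` stub below is a hypothetical stand-in, not litellm's actual code):

```python
from unittest.mock import patch


def load_router_config():
    # Hypothetical stand-in for the real function under test; the actual
    # implementation reads a config file and environment variables.
    raise RuntimeError("would touch real environment")


def test_load_router_config():
    # Patch the module-level name so the test runs hermetically.
    with patch(__name__ + ".load_router_config", return_value={"model_list": []}) as mocked:
        assert load_router_config() == {"model_list": []}
        mocked.assert_called_once()
```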
Ishaan Jaff
f0c727a597 fix clarifai - test 2024-05-11 16:54:22 -07:00
Krrish Dholakia
49ab1a1d3f fix(bedrock_httpx.py): working async bedrock command r calls 2024-05-11 16:45:20 -07:00
Ishaan Jaff
2e9f4ff23a fix oidc tests 2024-05-11 16:31:38 -07:00
Ishaan Jaff
d77aea7253
Update test_bedrock_completion.py
cc @Manouchehri - can u lmk what needs to be in our env to pass this test ? 

attaching the test log here: cda0de1d-3851-469c-8851-ef12dc27fab2/jobs/20819/tests#failed-test-0
2024-05-11 16:30:29 -07:00
Ishaan Jaff
732d4496fe bump: version 1.37.4 → 1.37.5 2024-05-11 16:07:43 -07:00
Ishaan Jaff
0887e9cc0d fix - oidc provider on python3.8 2024-05-11 16:01:34 -07:00
Ishaan Jaff
d7f7120880 ui - new build 2024-05-11 15:58:55 -07:00
Ishaan Jaff
19a65ea75b
Merge pull request #3588 from msabramo/msabramo/test_proxy_server_client_no_auth_fake_env_vars
Set fake env vars for `client_no_auth` fixture
2024-05-11 15:57:28 -07:00
Ishaan Jaff
91a6a0eef4 (Fix) - linting errors 2024-05-11 15:57:06 -07:00
Marc Abramowitz
9167ff0d75 Set fake env vars for client_no_auth fixture
This allows all of the tests in `test_proxy_server.py` to pass, with the
exception of `test_load_router_config`, without needing to set up real
environment variables.

Before:

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py -k 'not test_load_router_config' --disable-warnings
...
========================================================== short test summary info ===========================================================
ERROR litellm/tests/test_proxy_server.py::test_bedrock_embedding - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion_azure - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion_optional_params - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_embedding - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_engines_model_chat_completions - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_health - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_img_gen - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_openai_deployments_model_chat_completions_azure - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
========================================== 2 skipped, 1 deselected, 39 warnings, 9 errors in 3.24s ===========================================
```

After:

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py -k 'not test_load_router_config' --disable-warnings
============================================================ test session starts =============================================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, asyncio-0.23.6, mock-3.14.0
asyncio: mode=Mode.STRICT
collected 12 items / 1 deselected / 11 selected

litellm/tests/test_proxy_server.py s.........s                                                                                         [100%]

========================================== 9 passed, 2 skipped, 1 deselected, 48 warnings in 8.42s ===========================================
```
2024-05-11 15:22:30 -07:00
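The commit above makes the `client_no_auth` fixture set fake env vars so the proxy tests pass without real credentials. Pytest's `monkeypatch.setenv` is the usual tool for this; the dependency-free sketch below shows the same idea as a context manager (the helper name and variable names are hypothetical, not litellm's code):

```python
import os
from contextlib import contextmanager


@contextmanager
def fake_env(**overrides):
    """Hypothetical helper mirroring the commit's idea: temporarily set
    placeholder env vars so client construction succeeds without real keys,
    then restore the previous environment."""
    saved = {k: os.environ.get(k) for k in overrides}
    os.environ.update(overrides)
    try:
        yield
    finally:
        for k, v in saved.items():
            if v is None:
                os.environ.pop(k, None)
            else:
                os.environ[k] = v
```

Usage: `with fake_env(OPENAI_API_KEY="fake-key"): ...` — inside the block the OpenAI client sees a (fake) key and its constructor no longer raises, which is exactly the class of `openai.OpenAIError` failures listed in the "Before" run above.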
Krrish Dholakia
59c8c0adff feat(bedrock_httpx.py): working cohere command r async calls 2024-05-11 15:04:38 -07:00
Ishaan Jaff
25febe41c4 docs - using batch completions with python 2024-05-11 14:37:32 -07:00
Ishaan Jaff
bf2194d7fc feat - support model as csv on proxy 2024-05-11 14:27:20 -07:00
Ishaan Jaff
d4288b134b fix - use csv list for batch completions 2024-05-11 14:24:48 -07:00
Ishaan Jaff
b9b8bf52f3
Merge pull request #3581 from BerriAI/litellm_log_metadata_langfuse_traces
[Feat] - log metadata on traces + allow users to log metadata when `existing_trace_id` exists
2024-05-11 14:19:48 -07:00
Ishaan Jaff
a41bef5297 debug langfuse 2024-05-11 14:12:26 -07:00