Commit graph

11479 commits

Author SHA1 Message Date
Ishaan Jaff
754e10f3a4 fix - azure content safety testing does not work 2024-05-11 17:50:27 -07:00
Ishaan Jaff
fa28e69c35 fix test azure_content_safety 2024-05-11 17:48:05 -07:00
Ishaan Jaff
7a6df1a0ab fix - failing_AzureContentSafety tests 2024-05-11 17:39:06 -07:00
Ishaan Jaff
ed8a25c630 tests - unit test router retry logic 2024-05-11 17:31:01 -07:00
Ishaan Jaff
104fd4d048 router - clean up should_retry_this_error 2024-05-11 17:30:21 -07:00
Ishaan Jaff
9c4f1ec3e5 fix - failing test_end_user_specific_region test 2024-05-11 17:05:37 -07:00
Ishaan Jaff
18c2da213a retry logic on router 2024-05-11 17:04:19 -07:00
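The retry commits above name a helper, `should_retry_this_error`. A minimal sketch of what such an error classifier typically looks like — the body here is an assumption for illustration, not litellm's actual implementation:

```python
# Illustrative retry classifier: the helper name mirrors the commit message
# "clean up should_retry_this_error"; the classification rules below are an
# assumption, not litellm's actual logic.
def should_retry_this_error(status_code: int) -> bool:
    # Timeouts and rate limits are transient and worth retrying.
    if status_code in (408, 429):
        return True
    # Server-side errors may clear up on a different deployment.
    if 500 <= status_code < 600:
        return True
    # Auth and bad-request errors will fail the same way on retry.
    return False
```

A router retry loop would consult a predicate like this before re-dispatching a request to another deployment.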
Marc Abramowitz
b1bf49f0a1 Make test_load_router_config pass
by mocking the necessary things in the test.

Now all the tests in `test_proxy_server.py` pass! 🎉

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py --disable-warnings
====================================== test session starts ======================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, asyncio-0.23.6, mock-3.14.0
asyncio: mode=Mode.STRICT
collected 12 items

litellm/tests/test_proxy_server.py s..........s                                           [100%]

========================== 10 passed, 2 skipped, 48 warnings in 10.70s ==========================
```
2024-05-11 16:55:57 -07:00
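The commit above makes `test_load_router_config` pass "by mocking the necessary things in the test." A minimal sketch of that pattern — the function names and config shape are stand-ins, not litellm's actual internals:

```python
# Sketch of mocking a dependency so a config-loading test passes without
# real credentials or files. load_router_config here is a stand-in, not
# the real litellm function.
from unittest import mock

def load_router_config(path: str) -> dict:
    # Stand-in for the real loader, which would read a YAML file and
    # validate provider credentials.
    raise RuntimeError("needs a real config file and credentials")

def test_load_router_config():
    fake = {"model_list": [{"model_name": "gpt-3.5-turbo"}]}
    # Patch the loader in this module's namespace for the test's duration;
    # the patch is undone automatically when the context exits.
    with mock.patch(f"{__name__}.load_router_config", return_value=fake):
        config = load_router_config("config.yaml")
    assert config["model_list"][0]["model_name"] == "gpt-3.5-turbo"
```

The same idea scales to whole test modules: patch each external dependency at its import site so the test exercises only the code under test.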
Ishaan Jaff
f0c727a597 fix clarifai - test 2024-05-11 16:54:22 -07:00
Krrish Dholakia
49ab1a1d3f fix(bedrock_httpx.py): working async bedrock command r calls 2024-05-11 16:45:20 -07:00
Ishaan Jaff
2e9f4ff23a fix oidc tests 2024-05-11 16:31:38 -07:00
Ishaan Jaff
d77aea7253
Update test_bedrock_completion.py
cc @Manouchehri - can you let me know what needs to be in our env to pass this test?

attaching the test log here: cda0de1d-3851-469c-8851-ef12dc27fab2/jobs/20819/tests#failed-test-0
2024-05-11 16:30:29 -07:00
Ishaan Jaff
732d4496fe bump: version 1.37.4 → 1.37.5 2024-05-11 16:07:43 -07:00
Ishaan Jaff
0887e9cc0d fix - oidc provider on python3.8 2024-05-11 16:01:34 -07:00
Ishaan Jaff
d7f7120880 ui - new build 2024-05-11 15:58:55 -07:00
Ishaan Jaff
19a65ea75b
Merge pull request #3588 from msabramo/msabramo/test_proxy_server_client_no_auth_fake_env_vars
Set fake env vars for `client_no_auth` fixture
2024-05-11 15:57:28 -07:00
Ishaan Jaff
91a6a0eef4 (Fix) - linting errors 2024-05-11 15:57:06 -07:00
Marc Abramowitz
9167ff0d75 Set fake env vars for client_no_auth fixture
This allows all of the tests in `test_proxy_server.py` to pass, with the
exception of `test_load_router_config`, without needing to set up real
environment variables.

Before:

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py -k 'not test_load_router_config' --disable-warnings
...
========================================================== short test summary info ===========================================================
ERROR litellm/tests/test_proxy_server.py::test_bedrock_embedding - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion_azure - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion_optional_params - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_embedding - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_engines_model_chat_completions - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_health - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_img_gen - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_openai_deployments_model_chat_completions_azure - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
========================================== 2 skipped, 1 deselected, 39 warnings, 9 errors in 3.24s ===========================================
```

After:

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py -k 'not test_load_router_config' --disable-warnings
============================================================ test session starts =============================================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, asyncio-0.23.6, mock-3.14.0
asyncio: mode=Mode.STRICT
collected 12 items / 1 deselected / 11 selected

litellm/tests/test_proxy_server.py s.........s                                                                                         [100%]

========================================== 9 passed, 2 skipped, 1 deselected, 48 warnings in 8.42s ===========================================
```
2024-05-11 15:22:30 -07:00
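The fix above sets fake env vars for the `client_no_auth` fixture so that client construction no longer raises `openai.OpenAIError` for a missing key. A sketch of the idea, using the stdlib equivalent of pytest's `monkeypatch.setenv` (the `OPENAI_API_KEY` name comes from the error log above; the Azure key names are illustrative assumptions):

```python
# Placeholder credentials: the OpenAI client only checks that a key is
# present at construction time, so tests that never call the API can use
# fakes. AZURE_* names below are illustrative, not taken from the repo.
import os
from unittest import mock

FAKE_ENV = {
    "OPENAI_API_KEY": "sk-fake-key",
    "AZURE_API_KEY": "fake-azure-key",
    "AZURE_API_BASE": "https://fake.example.com",
}

def read_key_with_fake_env() -> str:
    # mock.patch.dict restores the original environment on exit, the same
    # per-test guarantee pytest's monkeypatch.setenv provides.
    with mock.patch.dict(os.environ, FAKE_ENV):
        return os.environ["OPENAI_API_KEY"]
```

In a pytest fixture, the same effect is `monkeypatch.setenv("OPENAI_API_KEY", "sk-fake-key")` before building the test client.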
Krrish Dholakia
59c8c0adff feat(bedrock_httpx.py): working cohere command r async calls 2024-05-11 15:04:38 -07:00
Ishaan Jaff
25febe41c4 docs - using batch completions with python 2024-05-11 14:37:32 -07:00
Ishaan Jaff
bf2194d7fc feat - support model as csv on proxy 2024-05-11 14:27:20 -07:00
Ishaan Jaff
d4288b134b fix - use csv list for batch completions 2024-05-11 14:24:48 -07:00
Ishaan Jaff
b9b8bf52f3
Merge pull request #3581 from BerriAI/litellm_log_metadata_langfuse_traces
[Feat] - log metadata on traces + allow users to log metadata when `existing_trace_id` exists
2024-05-11 14:19:48 -07:00
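The PR merged above logs metadata on traces even when an `existing_trace_id` is supplied. An illustrative sketch of that behavior — the payload-building logic is an assumption, not the actual Langfuse integration code:

```python
# Illustrative sketch of the PR's behavior: reuse the caller's trace when
# existing_trace_id is present, and still attach the remaining metadata to
# it. The payload shape is an assumption, not the real integration code.
def build_trace_payload(metadata: dict) -> dict:
    payload = {}
    trace_id = metadata.pop("existing_trace_id", None)
    if trace_id is not None:
        # Attach to the caller's existing trace instead of opening a new one.
        payload["id"] = trace_id
    # Remaining user-supplied keys still land on the trace as metadata.
    payload["metadata"] = metadata
    return payload
```

Before this change, metadata passed alongside an existing trace id was at risk of being dropped rather than merged onto the trace.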
Ishaan Jaff
a41bef5297 debug langfuse 2024-05-11 14:12:26 -07:00
Ishaan Jaff
360d284058 docs - debug langfuse 2024-05-11 14:12:17 -07:00
Ishaan Jaff
1bf8e7ac75 fix langfuse debug mode 2024-05-11 14:08:39 -07:00
Ishaan Jaff
97c81a5c7e fix langfuse test 2024-05-11 14:03:40 -07:00
Ishaan Jaff
038522ab24 fix - support debugging litellm params 2024-05-11 14:02:16 -07:00
Ishaan Jaff
bf909a89f8
Merge pull request #3585 from BerriAI/litellm_router_batch_comp
[Litellm Proxy + litellm.Router] - Pass the same message/prompt to N models
2024-05-11 13:51:45 -07:00
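The PR merged above passes the same message/prompt to N models. The underlying fan-out pattern can be sketched with plain `asyncio` — the function and parameter names below are illustrative stand-ins, not the Router's actual API:

```python
# Fan-out sketch: send one set of messages to N models concurrently and
# collect every response. call_model is a stand-in for an async completion
# call to a single deployment; names here are not the Router API.
import asyncio

async def call_model(model: str, messages: list) -> dict:
    # Stand-in for an async completion request.
    await asyncio.sleep(0)
    return {"model": model, "messages": messages}

async def batch_acompletion(models: list, messages: list) -> list:
    # Same messages, N models, gathered concurrently.
    return await asyncio.gather(*(call_model(m, messages) for m in models))
```

The "model as csv" commits below fit the same picture: the proxy splits a comma-separated model string into the `models` list before fanning out.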
Ishaan Jaff
71564895ae
Merge pull request #3583 from BerriAI/litellm_show_token_hash_ui
[UI] Show Token ID/Hash on Admin UI
2024-05-11 13:45:50 -07:00
Ishaan Jaff
62276fc221 docs link to litellm batch completions 2024-05-11 13:45:32 -07:00
Krrish Dholakia
4a3b084961 feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls 2024-05-11 13:43:08 -07:00
Ishaan Jaff
5918ee543b docs - batch completion litellm proxy 2024-05-11 13:42:41 -07:00
Ishaan Jaff
e1f94fcbbb test batch completions on litellm proxy 2024-05-11 13:32:30 -07:00
Ishaan Jaff
31cb1be279 edit dev config.yaml 2024-05-11 13:24:59 -07:00
Ishaan Jaff
b8c7bbcb9f support batch /chat/completions on proxy 2024-05-11 13:24:25 -07:00
Ishaan Jaff
6561e0838e test - router.batch_acompletion 2024-05-11 13:09:17 -07:00
Ishaan Jaff
9156b7448a feat - router async batch acompletion 2024-05-11 13:08:16 -07:00
Ishaan Jaff
69afc14a82 fix - show token hashes on ui 2024-05-11 12:42:14 -07:00
Krish Dholakia
86d0c0ae4e
Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
feat(router.py): allow setting model_region in litellm_params
2024-05-11 11:36:22 -07:00
Krrish Dholakia
0c87bb5adf docs(reliability.md): add region based routing to proxy + sdk docs 2024-05-11 11:34:12 -07:00
Ishaan Jaff
b146336e79 clean up key info tab 2024-05-11 11:30:10 -07:00
Ishaan Jaff
69452f003d ui - show token hashes on ui 2024-05-11 11:21:53 -07:00
Krrish Dholakia
6714854bb7 feat(router.py): support region routing for bedrock, vertex ai, watsonx 2024-05-11 11:04:00 -07:00
Ishaan Jaff
6577719bf8 fix - langfuse trace 2024-05-11 10:28:13 -07:00
Ishaan Jaff
97ba230b7a fix langfuse test 2024-05-11 10:20:30 -07:00
Ishaan Jaff
ebb5c76e37 fix langfuse log clean metadata 2024-05-11 10:19:02 -07:00
Krrish Dholakia
ebc927f1c8 feat(router.py): allow setting model_region in litellm_params
Closes https://github.com/BerriAI/litellm/issues/3580
2024-05-11 10:18:08 -07:00
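The commit above allows setting `model_region` in `litellm_params`. A hypothetical proxy config fragment showing the shape the commit title implies — the exact key placement and values are assumptions, not taken from the repo's docs:

```yaml
# Illustrative config fragment (key placement is an assumption): pin a
# deployment to a region via model_region so region-based routing can
# select it for region-restricted end users.
model_list:
  - model_name: claude-bedrock
    litellm_params:
      model: bedrock/anthropic.claude-v2
      model_region: eu-west-1
```

Combined with the region-routing commits above (bedrock, vertex ai, watsonx), this lets the router keep an end user's traffic inside a declared region.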
Ishaan Jaff
e83743f8e1 fix langfuse - log metadata on traces 2024-05-11 09:59:05 -07:00
Krish Dholakia
d33e49411d
Merge pull request #3561 from simonsanvil/feature/watsonx-integration
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00