Commit graph

3006 commits

Author SHA1 Message Date
Ishaan Jaff
2e9f4ff23a fix oidc tests 2024-05-11 16:31:38 -07:00
Ishaan Jaff
d77aea7253
Update test_bedrock_completion.py
cc @Manouchehri - can you let me know what needs to be in our env to pass this test?

attaching the test log here: cda0de1d-3851-469c-8851-ef12dc27fab2/jobs/20819/tests#failed-test-0
2024-05-11 16:30:29 -07:00
Ishaan Jaff
19a65ea75b
Merge pull request #3588 from msabramo/msabramo/test_proxy_server_client_no_auth_fake_env_vars
Set fake env vars for `client_no_auth` fixture
2024-05-11 15:57:28 -07:00
Marc Abramowitz
9167ff0d75 Set fake env vars for client_no_auth fixture
This allows all of the tests in `test_proxy_server.py` to pass, with the
exception of `test_load_router_config`, without needing to set up real
environment variables (a sketch of the fixture approach appears after the logs below).

Before:

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py -k 'not test_load_router_config' --disable-warnings
...
========================================================== short test summary info ===========================================================
ERROR litellm/tests/test_proxy_server.py::test_bedrock_embedding - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion_azure - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_chat_completion_optional_params - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_embedding - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_engines_model_chat_completions - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_health - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_img_gen - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
ERROR litellm/tests/test_proxy_server.py::test_openai_deployments_model_chat_completions_azure - openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY enviro...
========================================== 2 skipped, 1 deselected, 39 warnings, 9 errors in 3.24s ===========================================
```

After:

```shell
$ env -i PATH=$PATH poetry run pytest litellm/tests/test_proxy_server.py -k 'not test_load_router_config' --disable-warnings
============================================================ test session starts =============================================================
platform darwin -- Python 3.12.3, pytest-7.4.4, pluggy-1.5.0
rootdir: /Users/abramowi/Code/OpenSource/litellm
plugins: anyio-4.3.0, asyncio-0.23.6, mock-3.14.0
asyncio: mode=Mode.STRICT
collected 12 items / 1 deselected / 11 selected

litellm/tests/test_proxy_server.py s.........s                                                                                         [100%]

========================================== 9 passed, 2 skipped, 1 deselected, 48 warnings in 8.42s ===========================================
```
2024-05-11 15:22:30 -07:00
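
A minimal sketch of the fixture approach described above, assuming pytest's `monkeypatch` (the fixture name and placeholder values are illustrative, not the PR's exact code):

```python
import pytest

@pytest.fixture
def fake_env(monkeypatch):
    # Placeholder credentials: enough for the OpenAI/Azure clients to be
    # constructed; these tests never send real requests.
    monkeypatch.setenv("OPENAI_API_KEY", "fake-openai-api-key")
    monkeypatch.setenv("AZURE_API_KEY", "fake-azure-api-key")
    monkeypatch.setenv("AZURE_API_BASE", "https://fake-azure.example.com")
```

`monkeypatch.setenv` restores the environment after each test, so the fake values never leak between tests.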
Krrish Dholakia
59c8c0adff feat(bedrock_httpx.py): working cohere command r async calls 2024-05-11 15:04:38 -07:00
Ishaan Jaff
b9b8bf52f3
Merge pull request #3581 from BerriAI/litellm_log_metadata_langfuse_traces
[Feat] - log metadata on traces + allow users to log metadata when `existing_trace_id` exists
2024-05-11 14:19:48 -07:00
Ishaan Jaff
97c81a5c7e fix langfuse test 2024-05-11 14:03:40 -07:00
Ishaan Jaff
bf909a89f8
Merge pull request #3585 from BerriAI/litellm_router_batch_comp
[Litellm Proxy + litellm.Router] - Pass the same message/prompt to N models
2024-05-11 13:51:45 -07:00
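
To illustrate the fan-out idea in #3585, a sketch that sends one prompt to N configured models concurrently via `Router.acompletion` (model names are illustrative, and provider API keys are assumed to be set; the PR adds a dedicated batch method, this only approximates the behavior):

```python
import asyncio
from litellm import Router

router = Router(
    model_list=[
        {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}},
        {"model_name": "claude-3-haiku", "litellm_params": {"model": "claude-3-haiku-20240307"}},
    ]
)

async def fan_out(prompt: str):
    # Same message list, one request per configured model, run concurrently.
    messages = [{"role": "user", "content": prompt}]
    tasks = [
        router.acompletion(model=name, messages=messages)
        for name in ("gpt-3.5-turbo", "claude-3-haiku")
    ]
    return await asyncio.gather(*tasks)

responses = asyncio.run(fan_out("Summarize httpx in one sentence."))
```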
Krrish Dholakia
4a3b084961 feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls 2024-05-11 13:43:08 -07:00
Ishaan Jaff
6561e0838e test - router.batch_acompletion 2024-05-11 13:09:17 -07:00
Krish Dholakia
86d0c0ae4e
Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
feat(router.py): allow setting model_region in litellm_params
2024-05-11 11:36:22 -07:00
Ishaan Jaff
6577719bf8 fix - langfuse trace 2024-05-11 10:28:13 -07:00
Ishaan Jaff
97ba230b7a fix langfuse test 2024-05-11 10:20:30 -07:00
Krrish Dholakia
ebc927f1c8 feat(router.py): allow setting model_region in litellm_params
Closes https://github.com/BerriAI/litellm/issues/3580
2024-05-11 10:18:08 -07:00
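
A hedged sketch of the `model_region` setting this commit adds to `litellm_params` (the endpoint, key, and region value are placeholders):

```python
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-4",
            "litellm_params": {
                "model": "azure/gpt-4",
                "api_base": "https://my-eu-deployment.openai.azure.com",  # placeholder endpoint
                "api_key": "sk-...",   # placeholder key
                "model_region": "eu",  # explicit region, per this commit
            },
        }
    ]
)
```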
Krish Dholakia
d33e49411d
Merge pull request #3561 from simonsanvil/feature/watsonx-integration
(fix) Fixed linting and other bugs with watsonx provider
2024-05-11 09:56:02 -07:00
Krish Dholakia
8f6ae9a059
Merge pull request #3369 from mogith-pn/main
Clarifai-LiteLLM: Added Clarifai as an LLM provider.
2024-05-11 09:31:46 -07:00
Krish Dholakia
bbe1300c5b
Merge branch 'main' into feat/add-azure-content-filter 2024-05-11 09:30:38 -07:00
Krish Dholakia
3ee61350ed
Merge pull request #3424 from lunary-ai/main
Fix tool calls tracking with Lunary
2024-05-11 09:26:26 -07:00
Krish Dholakia
40063798bd
Merge pull request #3507 from Manouchehri/oidc-3505-part-1
Initial OIDC support (Google/GitHub/CircleCI -> Amazon Bedrock & Azure OpenAI)
2024-05-11 09:25:17 -07:00
Krrish Dholakia
94f3d361b0 fix(vertex_ai.py): fix list tool call responses
Closes https://github.com/BerriAI/litellm/issues/3147
2024-05-10 20:05:58 -07:00
Ishaan Jaff
2c4604d90f (ci/cd) run again 2024-05-10 19:22:13 -07:00
Ishaan Jaff
b09075da53
Merge pull request #3577 from BerriAI/litellm_add_triton_server
[Feat] Add Triton Embeddings to LiteLLM
2024-05-10 19:20:23 -07:00
Ishaan Jaff
0cde9473c9 test triton embeddings 2024-05-10 18:50:34 -07:00
Krish Dholakia
1aa567f3b5
Merge pull request #3571 from BerriAI/litellm_hf_classifier_support
Huggingface classifier support
2024-05-10 17:54:27 -07:00
Ishaan Jaff
e3848abdfe
Merge pull request #3569 from BerriAI/litellm_fix_bug_upsert_deployments
[Fix] Upsert deployment bug
2024-05-10 16:53:59 -07:00
Ishaan Jaff
1a8e853817 (ci/cd) run again 2024-05-10 16:19:03 -07:00
Krrish Dholakia
6a400a6200 test: fix test 2024-05-10 15:49:20 -07:00
Krrish Dholakia
500995696a test: fix linting 2024-05-10 14:42:06 -07:00
Krrish Dholakia
d4d175030f docs(huggingface.md): add text-classification to huggingface docs 2024-05-10 14:39:14 -07:00
Krrish Dholakia
50be25d11a test(test_optional_params.py): fix optional params 2024-05-10 14:08:47 -07:00
Krrish Dholakia
c17f221b89 test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi 2024-05-10 14:07:01 -07:00
Krrish Dholakia
781d5888c3 docs(predibase.md): add support for predibase to docs 2024-05-10 10:58:35 -07:00
Krrish Dholakia
cdec7a414f test(test_router_fallbacks.py): fix test 2024-05-10 09:58:40 -07:00
Krrish Dholakia
9a31f3d3d9 fix(main.py): support env var 'VERTEX_PROJECT' and 'VERTEX_LOCATION' 2024-05-10 07:57:56 -07:00
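
A minimal sketch of the env-var usage this fix enables (the project and location values are placeholders; assumes Vertex credentials are otherwise configured):

```python
import os
import litellm

# With this fix, the project and location can come from the environment
# instead of being passed as vertex_project/vertex_location arguments.
os.environ["VERTEX_PROJECT"] = "my-gcp-project"  # placeholder
os.environ["VERTEX_LOCATION"] = "us-central1"    # placeholder

response = litellm.completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "Hello"}],
)
```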
Simon Sanchez Viloria
e1372de9ee Merge branch 'main' into feature/watsonx-integration 2024-05-10 12:09:09 +02:00
Simon Sanchez Viloria
d3d82827ed (test) Add tests for WatsonX completion/acompletion streaming 2024-05-10 11:55:58 +02:00
Antonio Loison
7ee07cd961 test(test_caching.py): use mock_response in disk cache test 2024-05-10 11:00:18 +02:00
Antonio Loison
ac27f431a4 test(test_caching.py): add disk cache test when using completion 2024-05-10 10:03:38 +02:00
Krish Dholakia
a671046b45
Merge pull request #3552 from BerriAI/litellm_predibase_support
feat(predibase.py): add support for predibase provider
2024-05-09 22:21:16 -07:00
Ishaan Jaff
5eb12e30cc
Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion
[Feat] support `stream_options` on `litellm.text_completion`
2024-05-09 18:05:58 -07:00
Krrish Dholakia
d7189c21fd feat(predibase.py): support async_completion + streaming (sync + async)
finishes up the PR
2024-05-09 17:41:27 -07:00
Krrish Dholakia
186c0ec77b feat(predibase.py): add support for predibase provider
Closes https://github.com/BerriAI/litellm/issues/1253
2024-05-09 16:39:43 -07:00
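
A hedged usage sketch for the new provider, following LiteLLM's `provider/model` naming convention (the model id and credential are placeholders; Predibase may require additional tenant configuration):

```python
import os
import litellm

os.environ["PREDIBASE_API_KEY"] = "pb-..."  # placeholder credential

response = litellm.completion(
    model="predibase/llama-3-8b-instruct",  # hypothetical model id
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```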
Krrish Dholakia
c4295e1667 test(test_least_busy_routing.py): avoid deployments with low rate limits 2024-05-09 13:54:24 -07:00
Krrish Dholakia
e3f25a4a1f fix(auth_checks.py): fix 'get_end_user_object'
await the cache get call
2024-05-09 13:05:56 -07:00
Ishaan Jaff
a29fcc057b test - stream_options on OpenAI text_completion 2024-05-09 08:41:31 -07:00
Ishaan Jaff
0b1885ca99
Merge pull request #3537 from BerriAI/litellm_support_stream_options_param
[Feat] support `stream_options` param for OpenAI
2024-05-09 08:34:08 -07:00
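
For reference, a sketch of the `stream_options` parameter this PR wires through (mirroring OpenAI's parameter of the same name; assumes `OPENAI_API_KEY` is set):

```python
import litellm

stream = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Count to three."}],
    stream=True,
    stream_options={"include_usage": True},  # final chunk reports token usage
)

for chunk in stream:
    print(chunk)
```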
Krrish Dholakia
4cfd988529 fix(get_api_base): fix get_api_base to handle model with alias 2024-05-09 08:01:17 -07:00
Ishaan Jaff
dfd6361310 fix completion vs acompletion params 2024-05-09 07:59:37 -07:00
Ishaan Jaff
f2965660dd test openai stream_options 2024-05-08 21:52:39 -07:00
Ishaan Jaff
282b8d0ae4 test bedrock pricing 2024-05-08 15:26:53 -07:00