Commit graph

2596 commits

Author SHA1 Message Date
Krrish Dholakia
280148543f fix(router.py): fix trailing slash handling for api base which contains /v1 2024-04-27 17:36:28 -07:00
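The trailing-slash fix above can be sketched as a tiny normalizer (hypothetical helper, not litellm's actual implementation): an `api_base` ending in `/v1/` must resolve to the same endpoint as one ending in `/v1`.

```python
def strip_trailing_slash(api_base: str) -> str:
    # An api_base of "https://host/v1/" would otherwise yield request
    # paths like "/v1//chat/completions" when routes are appended
    return api_base.rstrip("/")
```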
Krrish Dholakia
ec19c1654b fix(router.py): set initial value of default litellm params to none 2024-04-27 17:22:50 -07:00
Krrish Dholakia
d9e0d7ce52 test: replace flaky endpoint 2024-04-27 16:37:09 -07:00
CyanideByte
a4c7d933a9 Added pytest for pydantic protected namespace warning 2024-04-27 15:44:40 -07:00
Krrish Dholakia
9f24421d44 fix(router.py): fix router should_retry 2024-04-27 15:13:20 -07:00
Krrish Dholakia
5e0bd5982e fix(router.py): fix sync should_retry logic 2024-04-27 14:48:07 -07:00
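The `should_retry` fixes above concern the router's retry decision. A minimal sketch of the usual shape of such logic (helper name and status-code policy are illustrative, not litellm's exact rules):

```python
def should_retry(status_code: int, remaining_retries: int) -> bool:
    # Retry only while retries remain, and only for rate limits (429)
    # or transient server-side errors (5xx); client errors fail fast
    if remaining_retries <= 0:
        return False
    return status_code == 429 or status_code >= 500
```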
CyanideByte
e1786848cb protected_namespaces fixed for model_info 2024-04-27 13:08:45 -07:00
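The `protected_namespaces` fix addresses a pydantic v2 behavior: field names starting with `model_` (such as `model_info`) trigger a `UserWarning` because pydantic reserves that prefix. A sketch of the fix (class name hypothetical):

```python
from pydantic import BaseModel, ConfigDict

class DeploymentModelInfo(BaseModel):
    # Clearing protected_namespaces silences the "model_" prefix
    # warning for fields like model_name / model_info
    model_config = ConfigDict(protected_namespaces=())

    model_name: str
    model_info: dict = {}
```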
Ishaan Jaff
6762d07c7f
Merge pull request #3330 from BerriAI/litellm_rdct_msgs
[Feat] Redact Logging Messages/Response content on Logging Providers with `litellm.turn_off_message_logging=True`
2024-04-27 11:25:09 -07:00
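The redaction feature in PR #3330 scrubs message content before it reaches a logging provider. A minimal sketch of the idea (function name and placeholder string are illustrative):

```python
def redact_logged_messages(payload: dict, turn_off_message_logging: bool) -> dict:
    # Strip user/assistant content before the payload is sent to a
    # logging provider such as Langfuse
    if not turn_off_message_logging:
        return payload
    scrubbed = dict(payload)
    scrubbed["messages"] = [
        {**m, "content": "redacted-by-litellm"}
        for m in payload.get("messages", [])
    ]
    return scrubbed
```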
Krish Dholakia
1a06f009d1 Merge branch 'main' into litellm_default_router_retries 2024-04-27 11:21:57 -07:00

Krrish Dholakia
2c67791663 test(test_completion.py): modify acompletion test to call pre-deployed watsonx endpoint 2024-04-27 11:19:00 -07:00
Krrish Dholakia
48f19cf839 feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx 2024-04-27 11:06:18 -07:00
Ishaan Jaff
743dfdb950 test - redacting messages from langfuse 2024-04-27 10:03:34 -07:00
Krish Dholakia
2a006c3d39 Revert "Fix Anthropic Messages Prompt Template function to add a third condition: list of text-content dictionaries" 2024-04-27 08:57:18 -07:00
Krish Dholakia
2d976cfabc
Merge pull request #3270 from simonsanvil/feature/watsonx-integration
(feat) add IBM watsonx.ai as an llm provider
2024-04-27 05:48:34 -07:00
Emir Ayar
2ecbf6663a Add test for completion with text content dictionaries 2024-04-27 12:27:12 +02:00
Krrish Dholakia
e05764bdb7 fix(router.py): add /v1/ if missing to base url, for openai-compatible api's
Fixes https://github.com/BerriAI/litellm/issues/2279
2024-04-26 17:05:07 -07:00
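The fix for issue #2279 appends `/v1` when an OpenAI-compatible base URL omits it. A sketch of that normalization (hypothetical helper name):

```python
def ensure_v1_suffix(api_base: str) -> str:
    # OpenAI-compatible servers expose their routes under /v1;
    # append it when the configured base URL leaves it off
    base = api_base.rstrip("/")
    if not base.endswith("/v1"):
        base += "/v1"
    return base
```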
Krrish Dholakia
180718c33f fix(router.py): support verify_ssl flag
Fixes https://github.com/BerriAI/litellm/issues/3162#issuecomment-2075273807
2024-04-26 15:38:01 -07:00
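Supporting a `verify_ssl` flag typically means handing the HTTP client an SSL context with verification disabled. A stdlib sketch of the mapping (helper name illustrative; litellm passes the flag to its httpx client):

```python
import ssl

def build_ssl_context(verify_ssl: bool) -> ssl.SSLContext:
    # verify_ssl=False disables certificate checks -- sometimes needed
    # behind intercepting corporate proxies, unsafe elsewhere
    ctx = ssl.create_default_context()
    if not verify_ssl:
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx
```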
Krrish Dholakia
7730520fb0 fix(router.py): allow passing httpx.timeout to timeout param in router
Closes https://github.com/BerriAI/litellm/issues/3162
2024-04-26 14:57:19 -07:00
Krish Dholakia
4b0f73500f Merge branch 'main' into litellm_default_router_retries 2024-04-26 14:52:24 -07:00
Krrish Dholakia
9eb75cc159 test(test_streaming.py): fix test 2024-04-25 20:22:18 -07:00
Krrish Dholakia
5307510592 test: rename test 2024-04-25 20:07:40 -07:00
Krrish Dholakia
850b056df5 fix(utils.py): add more logging to identify ci/cd issue 2024-04-25 19:57:24 -07:00
Krish Dholakia
40b6b4794b
Merge pull request #3310 from BerriAI/litellm_langfuse_error_logging_2
fix(proxy/utils.py): log rejected proxy requests to langfuse
2024-04-25 19:49:59 -07:00
Krrish Dholakia
4c5398b556 test(test_timeout.py): fix test 2024-04-25 19:35:30 -07:00
Krrish Dholakia
885de2e3c6 fix(proxy/utils.py): log rejected proxy requests to langfuse 2024-04-25 19:26:27 -07:00
Krish Dholakia
69280177a3
Merge pull request #3308 from BerriAI/litellm_fix_streaming_n
fix(utils.py): fix the response object returned when n>1 for stream=true
2024-04-25 18:36:54 -07:00
Krrish Dholakia
1985231022 test(test_timeout.py): explicitly set num retries = 0 2024-04-25 18:06:25 -07:00
Krrish Dholakia
9f5ba67f5d fix(utils.py): return logprobs as an object not dict 2024-04-25 17:55:18 -07:00
Krrish Dholakia
54241f2551 test(test_router_fallbacks.py): fix testing 2024-04-25 17:43:40 -07:00
Ishaan Jaff
de6e03f410
Merge pull request #3307 from BerriAI/litellm_set_alerts_per_channel
[Backend-Alerting] Separate alerting for different channels
2024-04-25 16:35:16 -07:00
Krrish Dholakia
caf1e28ba3 test(test_completion.py): fix test 2024-04-25 14:07:07 -07:00
Krrish Dholakia
5f8d88d363 fix(vertex_ai.py): handle stream=false
also adds unit testing for vertex ai calls with langchain
2024-04-25 13:59:37 -07:00
Krrish Dholakia
a819454647 test(test_completion.py): fix test to not raise exception if it works 2024-04-25 13:31:19 -07:00
Krrish Dholakia
6c5c7cca3d fix(utils.py): fix the response object returned when n>1 for stream=true
Fixes https://github.com/BerriAI/litellm/issues/3276
2024-04-25 13:27:29 -07:00
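The `n>1` streaming fix (issue #3276) is about keeping per-choice deltas separate while re-assembling a stream. A sketch of the aggregation, assuming OpenAI-style chunks of the form `{"choices": [{"index": i, "delta": {"content": "..."}}]}` (helper name hypothetical):

```python
def merge_stream_chunks(chunks):
    # Accumulate delta content per choice index, so n parallel
    # completions are not interleaved into one string
    texts = {}
    for chunk in chunks:
        for choice in chunk["choices"]:
            idx = choice["index"]
            texts[idx] = texts.get(idx, "") + choice["delta"].get("content", "")
    return [texts[i] for i in sorted(texts)]
```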
Ishaan Jaff
2aa849b7ae fix test alerting 2024-04-25 13:06:17 -07:00
Krrish Dholakia
160acc085a fix(router.py): fix default retry logic 2024-04-25 11:57:27 -07:00
Krrish Dholakia
4f46b4c397 fix(factory.py): add replicate meta llama prompt templating support 2024-04-25 08:25:00 -07:00
Ishaan Jaff
74817c560e (ci/cd) run again 2024-04-24 23:23:14 -07:00
Ishaan Jaff
4e707af592 Revert "fix(router.py): fix max retries on set_client"
This reverts commit 821844c1a3.
2024-04-24 23:19:14 -07:00
Ishaan Jaff
13e0ac64ef (fix) updating router settings 2024-04-24 23:09:25 -07:00
Krrish Dholakia
821844c1a3 fix(router.py): fix max retries on set_client 2024-04-24 22:03:01 -07:00
Ishaan Jaff
242830108c (ci/cd) run again 2024-04-24 21:09:49 -07:00
Krish Dholakia
435a4b5ed4
Merge pull request #3267 from BerriAI/litellm_openai_streaming_fix
fix(utils.py): fix streaming to not return usage dict
2024-04-24 21:08:33 -07:00
Ishaan Jaff
2c7f4695d9
Merge pull request #3283 from BerriAI/litellm_debug_lowest_latency
[Fix] Add better observability for debugging lowest latency routing
2024-04-24 20:42:52 -07:00
Krrish Dholakia
df7db2b870 fix(factory.py): support llama3 instruct chat template
allows automatic templating for llama3 instruct requests
2024-04-24 20:35:10 -07:00
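Llama 3 instruct templating wraps each chat turn in header/end tokens and appends an assistant header to cue generation. A sketch following Meta's documented Llama 3 format (function name hypothetical, not litellm's `factory.py` code):

```python
def llama3_instruct_prompt(messages):
    # Each turn: <|start_header_id|>role<|end_header_id|>\n\ncontent<|eot_id|>
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # Trailing assistant header tells the model to generate its reply
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt
```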
Krrish Dholakia
495aebb582 fix(utils.py): fix setattr error 2024-04-24 20:19:27 -07:00
Ishaan Jaff
2e6fc91a75 test - lowest latency logger 2024-04-24 16:35:43 -07:00
Ishaan Jaff
36f8431bf0 (ci/cd) testing 2024-04-24 13:25:18 -07:00
Ishaan Jaff
efbf85a5ad /model/update endpoint 2024-04-24 10:39:20 -07:00
Krrish Dholakia
48c2c3d78a fix(utils.py): fix streaming to not return usage dict
Fixes https://github.com/BerriAI/litellm/issues/3237
2024-04-24 08:06:07 -07:00