fe8d59f5eb | wslee | 2024-06-10 17:27:15 +09:00
    add friendli_ai provider

39bbc5d8ac | Ishaan Jaff | 2024-06-08 20:48:54 -07:00
    Merge pull request #4086 from BerriAI/litellm_sdk_tool_calling_fic
    [Fix] Litellm sdk - allow ChatCompletionMessageToolCall, and Function to be used as dict

b4fc4abb76 | Krish Dholakia | 2024-06-08 20:27:44 -07:00
    Merge pull request #4080 from BerriAI/litellm_predibase_exception_mapping
    fix(utils.py): improved predibase exception mapping

0253c2b213 | Ishaan Jaff | 2024-06-08 19:47:31 -07:00
    feat - allow ChatCompletionMessageToolCall, and Function to be used as dict

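A minimal sketch of the dict-style access 0253c2b213 enables; the tool definition is illustrative, and it assumes the model actually returns a tool call (and that an OPENAI_API_KEY is available):

```python
import litellm

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

tool_call = response.choices[0].message.tool_calls[0]
# Dict-style subscript access on ChatCompletionMessageToolCall / Function,
# which this change allows alongside attribute access:
print(tool_call["function"]["name"], tool_call["function"]["arguments"])
```
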
b26c3c7d22 | Krrish Dholakia | 2024-06-08 19:43:57 -07:00
    fix(cost_calculator.py): fixes tgai unmapped model pricing
    Fixes error where tgai helper function returned None. Enforces stronger type hints, refactors code, adds more unit testing.

1dafb1b3b7 | Krrish Dholakia | 2024-06-08 14:32:43 -07:00
    fix(utils.py): improved predibase exception mapping
    adds unit testing + better coverage for predibase errors

93a3a0cc1e | Krrish Dholakia | 2024-06-08 08:59:56 -07:00
    fix(utils.py): fix helicone success logging integration
    Fixes https://github.com/BerriAI/litellm/issues/4062

366fc5e40b | Ishaan Jaff | 2024-06-07 18:16:26 -07:00
    fix - vertex ai exception mapping

d5e97861ee | Ishaan Jaff | 2024-06-07 17:13:32 -07:00
    fix vertex ai exceptions

718b547646 | Ishaan Jaff | 2024-06-07 16:52:25 -07:00
    Merge branch 'main' into litellm_security_fix

5c46b386d0 | Krrish Dholakia | 2024-06-07 16:06:31 -07:00
    fix(utils.py): fix vertex ai exception mapping

f73b6033fd | Krrish Dholakia | 2024-06-07 15:39:15 -07:00
    fix(test_custom_callbacks_input.py): unit tests for 'turn_off_message_logging'
    ensure no raw request is logged either

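For context on what those tests exercise, a minimal sketch of the setting (assuming the module-level flag matches the name in the commit title):

```python
import litellm

# With this flag on, logging callbacks should receive redacted message content,
# and (per f73b6033fd) no raw request payload either.
litellm.turn_off_message_logging = True
```
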
d9dacc1f43 | Ishaan Jaff | 2024-06-07 14:02:17 -07:00
    Merge pull request #4065 from BerriAI/litellm_use_common_func
    [Refactor] - Refactor proxy_server.py to use common function for `add_litellm_data_to_request`

2cf3133669 | Ishaan Jaff | 2024-06-07 14:01:54 -07:00
    Merge branch 'main' into litellm_svc_logger

7ef7bc8a9a | Ishaan Jaff | 2024-06-07 13:48:21 -07:00
    fix simplify - pass litellm_parent_otel_span

d2857fc24c | Ishaan Jaff | 2024-06-07 12:43:53 -07:00
    Merge branch 'main' into litellm_redact_messages_slack_alerting

e66b3d264f | Krrish Dholakia | 2024-06-07 10:04:03 -07:00
    fix(factory.py): handle bedrock claude image url's

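A sketch of the call shape that hits this path: an OpenAI-style image_url content part sent to a Bedrock Claude model (the model id and image URL are illustrative; AWS credentials are assumed in the environment):

```python
import litellm

response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image"},
            # Per e66b3d264f, factory.py handles converting a remote image URL
            # into the format Bedrock Claude expects.
            {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
        ],
    }],
)
print(response.choices[0].message.content)
```
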
54ac848bfb | Ishaan Jaff | 2024-06-07 08:54:28 -07:00
    use litellm_parent_otel_span as litellm_param

26993c067e | Krish Dholakia | 2024-06-07 08:49:52 -07:00
    Merge branch 'main' into litellm_bedrock_converse_api

f6a262122b | Krish Dholakia | 2024-06-07 08:49:25 -07:00
    Merge pull request #4054 from BerriAI/litellm_aws_kms_support
    feat(aws_secret_manager.py): Support AWS KMS for Master Key encryption

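Illustrative only, not the code from #4054: a sketch of decrypting a KMS-encrypted master key held (base64-encoded) in an env var. The env-var name and the exact wiring inside aws_secret_manager.py are assumptions.

```python
import base64
import os

import boto3

# Hypothetical env var holding the KMS-encrypted, base64-encoded master key.
ciphertext = base64.b64decode(os.environ["LITELLM_MASTER_KEY_ENCRYPTED"])

# Decrypt via AWS KMS; credentials/region come from the standard AWS env config.
kms = boto3.client("kms")
master_key = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"].decode()
```
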
35e4323095 | Krrish Dholakia | 2024-06-07 08:47:51 -07:00
    refactor(main.py): only route anthropic calls through converse api
    v0 scope let's move function calling to converse api

471be6670c | Krish Dholakia | 2024-06-07 08:03:22 -07:00
    Merge pull request #4049 from BerriAI/litellm_cleanup_traceback
    refactor: replace 'traceback.print_exc()' with logging library

b6e0bf27b8 | Krish Dholakia | 2024-06-07 07:58:56 -07:00
    Merge branch 'main' into litellm_aws_kms_support

51ba5652a0 | Krrish Dholakia | 2024-06-06 22:13:21 -07:00
    feat(bedrock_httpx.py): working bedrock converse api streaming

a995a0b172 | Krrish Dholakia | 2024-06-06 20:12:41 -07:00
    fix(bedrock_httpx.py): working claude 3 function calling

6e9bca59b0 | Krrish Dholakia | 2024-06-06 17:12:30 -07:00
    fix(utils.py): fix exception mapping for azure internal server error

1742141fb6 | Krish Dholakia | 2024-06-06 16:37:03 -07:00
    Merge pull request #4046 from BerriAI/litellm_router_order
    feat(router.py): enable setting 'order' for a deployment in model list

677e0255c8 | Krish Dholakia | 2024-06-06 16:32:08 -07:00
    Merge branch 'main' into litellm_cleanup_traceback

a2da2a8f16 | Krrish Dholakia | 2024-06-06 15:32:51 -07:00
    feat(aws_secret_manager.py): allows user to keep a hash of the proxy master key in their env

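A sketch of the idea behind a2da2a8f16, using a hypothetical env-var name: the environment stores only a hash of the master key, and incoming keys are compared by hash rather than in plaintext.

```python
import hashlib
import os

# Hypothetical env var: a SHA-256 hex digest of the real master key.
stored_hash = os.environ.get("LITELLM_MASTER_KEY_HASH", "")

def is_master_key(candidate: str) -> bool:
    """Compare an incoming key against the stored hash, never the plaintext key."""
    return hashlib.sha256(candidate.encode()).hexdigest() == stored_hash
```
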
6cca5612d2 | Krrish Dholakia | 2024-06-06 13:47:43 -07:00
    refactor: replace 'traceback.print_exc()' with logging library
    allows error logs to be in json format for otel logging

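The pattern this refactor applies, sketched on a dummy function: route exceptions through the logging library so a JSON/OTEL formatter can capture them, instead of printing raw tracebacks to stderr.

```python
import logging

logger = logging.getLogger("litellm")

def flaky_call() -> None:
    raise ValueError("upstream provider error")

try:
    flaky_call()
except Exception:
    # Before: traceback.print_exc()  (unstructured text on stderr)
    # After: logger.exception() keeps the traceback but goes through logging,
    # so handlers/formatters (e.g. JSON for OTEL) can pick it up.
    logger.exception("completion call failed")
```
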
1e8429bb20 | Ishaan Jaff | 2024-06-06 10:38:15 -07:00
    feat - redact messages from slack alerting

38b44c301a | Raymond1415926 | 2024-06-06 10:12:20 -07:00
    Merge branch 'BerriAI:main' into main

a7dcf25722 | Krrish Dholakia | 2024-06-06 09:46:51 -07:00
    feat(router.py): enable setting 'order' for a deployment in model list
    Allows user to control which model gets called first in model group

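A sketch of a model group using the new 'order' setting; the exact nesting of "order" under litellm_params, and the omitted credentials, are assumptions based on the commit message.

```python
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-group",
            # order=1: try this deployment first within the "gpt-group" model group
            "litellm_params": {"model": "gpt-4o", "order": 1},
        },
        {
            "model_name": "gpt-group",
            "litellm_params": {"model": "gpt-3.5-turbo", "order": 2},
        },
    ]
)

response = router.completion(
    model="gpt-group",
    messages=[{"role": "user", "content": "hello"}],
)
```
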
f4c49755a0 | Raymond Huang | 2024-06-05 23:40:55 -07:00
    fix token counter bug

ea67803747 | Sha Ahammed Roze | 2024-06-06 10:02:15 +05:30
    Merge branch 'BerriAI:main' into main

a76a9b7d11 | Krrish Dholakia | 2024-06-05 21:20:36 -07:00
    feat(bedrock_httpx.py): add support for bedrock converse api
    closes https://github.com/BerriAI/litellm/issues/4000

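A minimal call that goes through the new Converse-based Bedrock path (streaming support landed in 51ba5652a0, listed above). The Claude model id is illustrative and AWS credentials are assumed in the environment.

```python
import litellm

for chunk in litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Say hello"}],
    stream=True,
):
    # Streamed chunks follow the OpenAI delta format.
    print(chunk.choices[0].delta.content or "", end="")
```
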
4d2337ec72 | Ishaan Jaff | 2024-06-05 13:35:31 -07:00
    Merge branch 'main' into patch-1

0a4abfdd1d | Sha Ahammed Roze | 2024-06-05 21:56:41 +05:30
    Merge branch 'BerriAI:main' into main

b360ab4c89 | Krrish Dholakia | 2024-06-05 09:03:10 -07:00
    fix(azure.py): support dynamic drop params

162f9400d2 | Krrish Dholakia | 2024-06-05 08:44:04 -07:00
    feat(utils.py): support dynamically setting 'drop_params'
    Allows user to turn this on/off for individual calls by passing in as a completion arg

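A sketch of the per-call toggle 162f9400d2 adds; the specific model/parameter pairing is illustrative, and the global litellm.drop_params flag already existed.

```python
import litellm

response = litellm.completion(
    model="command-r",  # example provider; assume it rejects some OpenAI params
    messages=[{"role": "user", "content": "hi"}],
    logit_bias={"1234": 10},  # a param the target provider may not support
    drop_params=True,         # per-call: drop unsupported params instead of erroring
)
```
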
faa4dfe03e | sha-ahammed | 2024-06-05 16:48:38 +05:30
    feat: Add Ollama as a provider in the proxy UI

6b57352400 | Ishaan Jaff | 2024-06-04 22:11:11 -07:00
    fix VertexAIException APIError

d6ba50319c | Ishaan Jaff | 2024-06-04 21:34:49 -07:00
    fix langfuse log metadata

c544ba3654 | Krish Dholakia | 2024-06-04 21:00:22 -07:00
    Merge pull request #4009 from BerriAI/litellm_fix_streaming_cost_cal
    fix(utils.py): fix cost calculation for openai-compatible streaming object

d6f4233441 | Krish Dholakia | 2024-06-04 20:59:39 -07:00
    Merge pull request #4015 from BerriAI/litellm_stream_options_fix_2
    feat(utils.py): Support `stream_options` param across all providers

1a95660495 | Krrish Dholakia | 2024-06-04 20:06:23 -07:00
    fix(test_completion.py): fix predibase test to be mock + fix optional param mapping for predibase

43af5575c8 | Krrish Dholakia | 2024-06-04 19:41:20 -07:00
    fix(utils.py): fix

54dacfdf61 | Krrish Dholakia | 2024-06-04 19:03:26 -07:00
    feat(utils.py): support 'stream_options' param across all providers
    Closes https://github.com/BerriAI/litellm/issues/3553

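The call shape 54dacfdf61 enables for every provider, sketched against an OpenAI model (assumes an OPENAI_API_KEY is set):

```python
import litellm

stream = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi"}],
    stream=True,
    stream_options={"include_usage": True},  # final chunk carries a usage block
)
for chunk in stream:
    print(chunk)
```
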
34f31a1994 | Krrish Dholakia | 2024-06-04 18:27:03 -07:00
    fix(utils.py): add coverage for text openai and databricks

9aa29854de | Krrish Dholakia | 2024-06-04 18:17:45 -07:00
    fix(utils.py): fix stream options to return consistent response object