Rajan Paneru
65b07bcb8c
Preserving the Pydantic Message Object
...
The following statement replaces the Pydantic Message object, re-initializing it as a plain dict:
model_response["choices"][0]["message"] = response_json["message"]
We need to make sure `message` is always a litellm.Message object.
As a fix, based on the code in the ollama.py file, I am updating just the content instead of the entire object, for both the sync and async functions.
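The fix described above can be sketched as follows. This is a minimal illustration, not litellm's actual code: the `Message` dataclass below is a hypothetical stand-in for litellm's Pydantic `Message` model, and the dict shapes mimic an Ollama-style response.

```python
from dataclasses import dataclass

@dataclass
class Message:  # hypothetical stand-in for litellm.Message (a Pydantic model in litellm)
    content: str = ""
    role: str = "assistant"

# Response object whose "message" slot must remain a Message instance.
model_response = {"choices": [{"message": Message()}]}
# Parsed provider response (Ollama-style), where "message" is a plain dict.
response_json = {"message": {"content": "Hello!", "role": "assistant"}}

# Buggy version: replaces the Message object with a plain dict, breaking
# downstream code that expects Message attributes.
#   model_response["choices"][0]["message"] = response_json["message"]

# Fixed version: mutate only the content, preserving the Message object.
model_response["choices"][0]["message"].content = response_json["message"]["content"]
```

After the fix, `model_response["choices"][0]["message"]` is still a `Message` instance with its `content` updated, which is the invariant the commit sets out to preserve.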
2024-05-10 22:12:32 +09:30
Krish Dholakia
8ad979cdfe
Merge branch 'main' into litellm_region_based_routing
2024-05-08 22:19:51 -07:00
Krish Dholakia
3f13251241
Merge pull request #3479 from simonsanvil/feature/watsonx-integration
...
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Krrish Dholakia
3d18897d69
feat(router.py): enable filtering model group by 'allowed_model_region'
2024-05-08 22:10:17 -07:00
Ishaan Jaff
6d955ef457
Merge branch 'main' into ui_show_spend_end_user
2024-05-08 18:29:25 -07:00
Ishaan Jaff
faab704d28
update global/spend/end_users
2024-05-08 17:03:09 -07:00
Krrish Dholakia
6575143460
feat(proxy_server.py): return litellm version in response headers
2024-05-08 16:00:08 -07:00
Krish Dholakia
303e0c6226
Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role"
2024-05-07 21:42:18 -07:00
Krish Dholakia
a325bf2fb8
Merge pull request #3478 from nkvch/Issue-#3474-anthropic-roles-alternation-issue
...
* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role
2024-05-07 21:24:47 -07:00
Ishaan Jaff
2725a55e7a
Merge pull request #3470 from mbektas/fix-ollama-embeddings
...
support sync ollama embeddings
2024-05-07 19:21:37 -07:00
nkvch
06d1aec49e
* fix(factory.py): fix conditional statement for merging messages with previous ones
2024-05-07 13:42:42 +02:00
nkvch
7d7b59ff78
* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role
2024-05-07 12:51:30 +02:00
Krrish Dholakia
c6d830d377
fix(anthropic.py): remove raise error on 'empty content'
...
Fixes https://github.com/BerriAI/litellm/issues/3453
2024-05-06 11:42:09 -07:00
Simon Sanchez Viloria
6181d1eaad
Merge branch 'main' into feature/watsonx-integration
2024-05-06 17:27:14 +02:00
Simon Sanchez Viloria
83a274b54b
(feat) support for async stream to watsonx provider
2024-05-06 17:08:40 +02:00
Simon Sanchez Viloria
62b3f25398
(fix) fix ibm_granite_pt and use hf_chat_template for llama-3 instruct prompts
2024-05-06 17:08:40 +02:00
Krish Dholakia
9f58583888
Merge pull request #3299 from themrzmaster/main
...
Allowing extra headers for bedrock
2024-05-06 07:45:53 -07:00
Krrish Dholakia
b5f3f198f2
fix(utils.py): anthropic error handling
2024-05-06 07:25:12 -07:00
Lucca Zenóbio
b22517845e
Merge branch 'main' into main
2024-05-06 09:40:23 -03:00
Mehmet Bektas
3acad270e5
support sync ollama embeddings
2024-05-05 19:44:25 -07:00
Jack Collins
bb6132eee1
Fix: get format from data not optional_params ollama non-stream completion
2024-05-05 18:59:26 -07:00
Jack Collins
81b1c46c6f
Add missing import itertools.chain
2024-05-05 18:54:08 -07:00
Jack Collins
03b82b78c1
Fix: Set finish_reason to tool_calls for non-stream responses in ollama
2024-05-05 18:52:31 -07:00
Jack Collins
297543e3e5
Parse streamed function calls as single delta in ollama
2024-05-05 18:52:20 -07:00
Jack Collins
dffe616267
Make newline same in async function
2024-05-05 18:51:53 -07:00
Jack Collins
c217a07d5e
Fix: Set finish_reason to tool_calls for non-stream responses
2024-05-05 18:47:58 -07:00
Jack Collins
107a77368f
Parse streamed function calls as single delta
2024-05-05 18:47:16 -07:00
Krish Dholakia
6be20f5fc6
Merge pull request #3455 from BerriAI/litellm_assistants_support
...
feat(openai.py): add support for openai assistants
2024-05-04 22:21:44 -07:00
Krrish Dholakia
1195bf296b
fix(openai.py): fix typing import for python 3.8
2024-05-04 21:49:30 -07:00
Krrish Dholakia
f2bf6411d8
fix(openai.py): fix linting error
2024-05-04 21:48:42 -07:00
Krrish Dholakia
8fe6c9b401
feat(assistants/main.py): support litellm.get_assistants()
and litellm.get_messages()
2024-05-04 21:30:28 -07:00
Krrish Dholakia
cad01fb586
feat(assistants/main.py): support 'litellm.get_threads'
2024-05-04 21:14:03 -07:00
Krrish Dholakia
b7796c7487
feat(assistants/main.py): add 'add_message' endpoint
2024-05-04 19:56:11 -07:00
Krrish Dholakia
681a95e37b
fix(assistants/main.py): support litellm.create_thread()
call
2024-05-04 19:35:37 -07:00
Krrish Dholakia
84c31a5528
feat(openai.py): add support for openai assistants
...
v0 commit. Closes https://github.com/BerriAI/litellm/issues/2842
2024-05-04 17:27:48 -07:00
Krrish Dholakia
09d7121af2
fix(bedrock.py): map finish reason for bedrock
2024-05-04 12:45:40 -07:00
Krrish Dholakia
8d49b3a84c
fix(factory.py): support openai 'functions' messages
2024-05-04 12:33:39 -07:00
Krrish Dholakia
d9d5149aa1
fix(factory.py): support mapping openai 'tool' message to anthropic format
2024-05-04 10:14:52 -07:00
Krrish Dholakia
33472bfd2b
fix(factory.py): support 'function' openai message role for anthropic
...
Fixes https://github.com/BerriAI/litellm/issues/3446
2024-05-04 10:03:30 -07:00
Krrish Dholakia
8249c986bf
fix(main.py): support new 'supports_system_message=False' param
...
Fixes https://github.com/BerriAI/litellm/issues/3325
2024-05-03 21:31:45 -07:00
Krish Dholakia
7d2aa2f645
Merge pull request #3433 from BerriAI/litellm_timeout_fix
...
fix(bedrock.py): convert httpx.timeout to boto3 valid timeout
2024-05-03 18:53:59 -07:00
Krrish Dholakia
097714e02f
fix(anthropic.py): handle whitespace characters for anthropic calls
2024-05-03 17:31:34 -07:00
Krrish Dholakia
0450abfdc1
fix(bedrock.py): fix boto3 config init
2024-05-03 17:22:06 -07:00
Krrish Dholakia
0b9fa53e3e
fix(anthropic.py): drop unsupported non-whitespace character value when calling anthropic with stop sequences
...
Fixes https://github.com/BerriAI/litellm/issues/3286
2024-05-03 16:59:49 -07:00
Krrish Dholakia
a732d8772a
fix(bedrock.py): convert httpx.timeout to boto3 valid timeout
...
Closes https://github.com/BerriAI/litellm/issues/3398
2024-05-03 16:24:21 -07:00
Lucca Zenóbio
78303b79ee
Merge branch 'main' into main
2024-05-02 09:46:34 -03:00
Krrish Dholakia
29aefaa7d7
fix(utils.py): fix default values
2024-05-01 22:24:37 -07:00
Krish Dholakia
0714eb3526
Merge branch 'main' into litellm_ollama_tool_call_reponse
2024-05-01 10:24:05 -07:00
alisalim17
0db7fa3fd8
fix: cohere tool results
2024-04-29 14:20:24 +04:00
Krish Dholakia
1841b74f49
Merge branch 'main' into litellm_common_auth_params
2024-04-28 08:38:06 -07:00