Krrish Dholakia | e66b3d264f | fix(factory.py): handle bedrock claude image URLs | 2024-06-07 10:04:03 -07:00
Krish Dholakia | 26993c067e | Merge branch 'main' into litellm_bedrock_converse_api | 2024-06-07 08:49:52 -07:00
Krrish Dholakia | a995a0b172 | fix(bedrock_httpx.py): working claude 3 function calling | 2024-06-06 20:12:41 -07:00
Rajan Paneru | 2e25416c02 | Table format and typo fixes | 2024-06-06 22:29:59 +09:30
    * Fixed markdown table formatting that rendered poorly in PyCharm
    * Typo fixes
Krrish Dholakia | a76a9b7d11 | feat(bedrock_httpx.py): add support for bedrock converse api | 2024-06-05 21:20:36 -07:00
    Closes https://github.com/BerriAI/litellm/issues/4000
Krrish Dholakia | e8df9c4041 | fix(factory.py): fix linting error | 2024-05-24 19:12:09 -07:00
Miguel Escriva | 43dc97a331 | fix(factory.py): Ollama vision fix | 2024-05-23 11:11:20 +02:00
    Add convert_to_ollama_image function to handle image conversion.
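The Ollama vision fix above hinges on normalizing image inputs before they reach the model. A minimal sketch of what such a helper could look like, assuming Ollama expects a bare base64 payload rather than an OpenAI-style data URL (this body is illustrative only, not litellm's actual `convert_to_ollama_image`):

```python
def convert_to_ollama_image(image_input: str) -> str:
    """Illustrative sketch: normalize an image input to the raw base64
    payload assumed to be expected by Ollama. Strips the prefix from
    'data:image/...;base64,<payload>' data URLs; any other string is
    assumed to already be base64-encoded image bytes."""
    if image_input.startswith("data:") and ";base64," in image_input:
        # Keep only the payload after the 'base64,' marker
        return image_input.split(";base64,", 1)[1]
    return image_input
```

A helper like this lets the rest of the prompt factory treat data URLs and raw base64 strings uniformly.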
Ishaan Jaff | 622e241e18 | Merge pull request #3711 from hmcp22/add_gemini_base64_image_support | 2024-05-20 13:34:39 -07:00
    Adding decoding of base64 image data for gemini pro 1.5
Krrish Dholakia | a2c66ed4fb | fix(vertex_ai.py): support passing in result of tool call to vertex | 2024-05-19 11:34:07 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/3709
hugo.lima | 32e25cdb11 | added decoding of base64 image data in _gemini_vision_convert_messages function | 2024-05-17 16:46:56 +01:00
Krish Dholakia | 8f6ae9a059 | Merge pull request #3369 from mogith-pn/main | 2024-05-11 09:31:46 -07:00
    Clarifai-LiteLLM: Added clarifai as LLM Provider.
Krish Dholakia | 8015bc1c47 | Revert "Add support for async streaming to watsonx provider" | 2024-05-09 07:44:15 -07:00
Krish Dholakia | 3f13251241 | Merge pull request #3479 from simonsanvil/feature/watsonx-integration | 2024-05-08 22:19:05 -07:00
    Add support for async streaming to watsonx provider
Ishaan Jaff | 6d955ef457 | Merge branch 'main' into ui_show_spend_end_user | 2024-05-08 18:29:25 -07:00
Ishaan Jaff | faab704d28 | update global/spend/end_users | 2024-05-08 17:03:09 -07:00
Krish Dholakia | 303e0c6226 | Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role" | 2024-05-07 21:42:18 -07:00
nkvch | 06d1aec49e | * fix(factory.py): fix conditional statement for merging messages with previous ones | 2024-05-07 13:42:42 +02:00
nkvch | 7d7b59ff78 | * feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role | 2024-05-07 12:51:30 +02:00
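The nkvch commits above add, and then fix the conditional for, a pass that merges consecutive messages of one role when they are separated only by an empty message of another role. A hedged sketch of that general technique (names and merge strategy are assumptions, not litellm's actual implementation):

```python
def merge_consecutive_messages(messages: list[dict]) -> list[dict]:
    """Illustrative sketch: merge consecutive messages of the same role,
    dropping the empty messages of another role that separate them."""
    merged: list[dict] = []
    for msg in messages:
        if not msg.get("content"):
            # Skip empty separator messages entirely
            continue
        if merged and merged[-1]["role"] == msg["role"]:
            # Same role as the previous kept message: append to it
            merged[-1]["content"] += "\n" + msg["content"]
        else:
            merged.append(dict(msg))
    return merged
```

Providers such as Anthropic reject conversations with two consecutive messages of the same role, which is the usual motivation for a pass like this.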
Simon Sanchez Viloria | 6181d1eaad | Merge branch 'main' into feature/watsonx-integration | 2024-05-06 17:27:14 +02:00
Simon Sanchez Viloria | 62b3f25398 | (fix) fix ibm_granite_pt and use hf_chat_template for llama-3 instruct prompts | 2024-05-06 17:08:40 +02:00
Krrish Dholakia | 8d49b3a84c | fix(factory.py): support openai 'functions' messages | 2024-05-04 12:33:39 -07:00
Krrish Dholakia | d9d5149aa1 | fix(factory.py): support mapping openai 'tool' message to anthropic format | 2024-05-04 10:14:52 -07:00
Krrish Dholakia | 33472bfd2b | fix(factory.py): support 'function' openai message role for anthropic | 2024-05-04 10:03:30 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/3446
Krrish Dholakia | 8249c986bf | fix(main.py): support new 'supports_system_message=False' param | 2024-05-03 21:31:45 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/3325
mogith-pn | d770df2259 | Merge branch 'main' into main | 2024-04-30 22:48:52 +05:30
mogith-pn | 318b4813f2 | Clarifai-LiteLLM integration (#1) | 2024-04-30 22:38:33 +05:30
    * intg v1 clarifai-litellm
    * Added more community models and testcases
    * Updated Clarifai markdown docs
alisalim17 | 0db7fa3fd8 | fix: cohere tool results | 2024-04-29 14:20:24 +04:00
Krish Dholakia | 2a006c3d39 | Revert "Fix Anthropic Messages Prompt Template function to add a third condition: list of text-content dictionaries" | 2024-04-27 08:57:18 -07:00
Krish Dholakia | 2d976cfabc | Merge pull request #3270 from simonsanvil/feature/watsonx-integration | 2024-04-27 05:48:34 -07:00
    (feat) add IBM watsonx.ai as an llm provider
Emir Ayar | 38b5f34c77 | Merge branch 'BerriAI:main' into fix-anthropic-messages-api | 2024-04-27 11:50:04 +02:00
Krrish Dholakia | ccb864bbea | fix(factory.py): correctly apply bos token for llama3 instruct prompt template | 2024-04-25 17:37:17 -07:00
aswny | 781af56f48 | check model type chat/instruct to apply template | 2024-04-25 17:52:38 +00:00
aswny | 9ead717531 | fix Llama models message-to-prompt conversion for AWS Bedrock provider | 2024-04-25 17:19:55 +00:00
Krrish Dholakia | 4f46b4c397 | fix(factory.py): add replicate meta llama prompt templating support | 2024-04-25 08:25:00 -07:00
Krrish Dholakia | df7db2b870 | fix(factory.py): support llama3 instruct chat template | 2024-04-24 20:35:10 -07:00
    Allows automatic templating for llama3 instruct requests
Krish Dholakia | 8d2e411df6 | Merge pull request #3124 from elisalimli/bugfix/add-missing-tool-calls-mistral-messages | 2024-04-23 17:25:12 -07:00
    Add missing tool_calls and name to messages
Simon Sanchez Viloria | d72b725273 | Fixed bugs in prompt factory for ibm-mistral and llama 3 models | 2024-04-23 16:20:49 +02:00
Simon S. Viloria | a77537ddd4 | Merge branch 'BerriAI:main' into feature/watsonx-integration | 2024-04-21 10:35:51 +02:00
Krish Dholakia | 9f91f9919f | Merge pull request #3115 from elisalimli/litellm_fix/missing-spaces-between-prompts-cohere-chat | 2024-04-20 13:15:20 -07:00
    Fix new line issue in cohere_message_pt
Krish Dholakia | 3faada32d5 | Merge pull request #3118 from n1lanjan/2833-anthropic-tool-call-fix | 2024-04-20 13:14:29 -07:00
    Fix tool call errors using anthropic
Simon Sanchez Viloria | 6edb133733 | Added support for IBM watsonx.ai models | 2024-04-20 20:06:46 +02:00
Ishaan Jaff | 200f58c1e7 | (fix) missing PIL import | 2024-04-20 08:27:00 -07:00
Krrish Dholakia | cf1cc71b87 | fix(factory.py): add mistral config to known tokenizer configs | 2024-04-18 17:11:52 -07:00
Nilanjan De | ca3d2fea56 | fix for #2904, remove XML characters in content | 2024-04-18 22:42:32 +04:00
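The #2904 fix above removes XML characters from message content, presumably because XML-delimited prompt formats (as used in some Anthropic tool-calling templates) can be confused by user-supplied tags. A speculative sketch of the idea (the regex and function name are illustrative only, not the actual change):

```python
import re

def strip_xml_tags(content: str) -> str:
    """Speculative sketch: strip angle-bracket tags from message content
    so user text cannot collide with XML-style prompt delimiters."""
    return re.sub(r"</?[a-zA-Z][^>]*>", "", content)
```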
Nilanjan De | ecfae6d465 | update factory.py | 2024-04-18 22:31:08 +04:00
Nilanjan De | e1fd463f8c | update factory.py | 2024-04-18 22:27:11 +04:00
alisalim17 | a3f90ab735 | fix: add missing tool_calls and function name to messages in mistral_api_pt | 2024-04-18 20:49:01 +04:00
Nilanjan De | c85018c780 | update factory.py | 2024-04-18 18:13:20 +04:00
Nilanjan De | 4c7d94b2b4 | update factory.py | 2024-04-18 17:35:52 +04:00
Nilanjan De | 7ca213e92f | update factory.py | 2024-04-18 15:12:31 +04:00