Krrish Dholakia
dd2ea72cb4
fix(openai.py): fix position of invalid_params param
2024-08-10 09:52:27 -07:00
Ishaan Jaff
d445f3967d
correctly add modified tool names to cache
2024-08-09 17:39:49 -07:00
Ishaan Jaff
8b98193a51
bedrock make_valid_bedrock_tool_name
2024-08-09 13:16:38 -07:00
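The two entries above describe sanitizing tool names for Bedrock and caching the modified names so responses can be mapped back. A minimal sketch of that idea, assuming a letters/digits/underscores constraint and a 64-character limit; the cache dict and helper names are illustrative, not litellm's actual code:

```python
import re

# Hypothetical cache: sanitized tool name -> original tool name (assumption)
_tool_name_cache: dict[str, str] = {}

def make_valid_bedrock_tool_name(name: str) -> str:
    """Sanitize a tool name so Bedrock accepts it, remembering the original."""
    # Assumed constraint: letters, digits, underscores, starting with a letter.
    valid = re.sub(r"[^a-zA-Z0-9_]", "_", name)
    if not valid or not valid[0].isalpha():
        valid = "tool_" + valid
    valid = valid[:64]  # assumed max length
    if valid != name:
        _tool_name_cache[valid] = name  # so tool-call responses can be mapped back
    return valid

def restore_original_tool_name(name: str) -> str:
    return _tool_name_cache.get(name, name)
```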
Krrish Dholakia
da0413f0a4
fix(factory.py): handle openai function message having tool call id
2024-08-08 17:06:50 -07:00
Krrish Dholakia
2ccb5a48b7
fix(bedrock_httpx.py): handle empty arguments returned during tool calling streaming
2024-08-07 09:54:50 -07:00
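The streaming fix above addresses tool calls whose arguments arrive as an empty string. A hedged illustration of the defensive parse (the function name is hypothetical):

```python
import json

def parse_streamed_tool_arguments(arguments: str) -> dict:
    # Streamed chunks may carry an empty arguments string; treat it as "{}"
    # so json.loads does not raise.
    return json.loads(arguments or "{}")

parse_streamed_tool_arguments("")                # -> {}
parse_streamed_tool_arguments('{"city": "SF"}')  # -> {"city": "SF"}
```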
Krrish Dholakia
259e3b1245
fix(cohere_chat.py): handle tool_result + user message being passed in
2024-07-29 21:38:45 -07:00
Krrish Dholakia
528741a809
fix(factory.py): handle special keys for mistral chat template
2024-07-29 17:47:44 -07:00
Krrish Dholakia
e0f3cd580c
fix(factory.py): support mistral ai prefix:true in messages
...
Fixes https://github.com/BerriAI/litellm/issues/4882
2024-07-27 22:30:40 -07:00
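For context on the prefix:true commit, Mistral's chat API lets the final assistant message carry a prefix flag so the model continues from that text instead of starting a new reply. An illustrative message list (values made up):

```python
# Example request shape for the assistant-prefix feature (values illustrative).
messages = [
    {"role": "user", "content": "Write a haiku about the sea."},
    # The trailing assistant message with prefix=True asks the model to
    # continue from this text rather than compose a fresh answer.
    {"role": "assistant", "content": "Waves fold into", "prefix": True},
]
```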
Krrish Dholakia
757dedd4c8
fix(factory.py): refactor factory to use httpx client
2024-07-19 15:35:05 -07:00
Krrish Dholakia
4c4f032a75
fix(factory.py): refactor factory to use httpx client
2024-07-19 15:32:01 -07:00
Krrish Dholakia
30d6c1d171
fix(factory.py): handle escape characters in image url
...
prevents mimetype errors in vertex ai calls
2024-07-18 17:36:58 -07:00
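The escape-character fix above concerns image URLs whose backslash escapes break mimetype detection. A rough sketch of the idea, assuming the backslashes simply need stripping before the type is guessed:

```python
import mimetypes

def clean_image_url(url: str) -> str:
    # Backslash escapes (e.g. "\?") in an image URL break mimetype guessing,
    # which the vertex ai call needs; strip them before inferring the type.
    return url.replace("\\", "")

url = "https://example.com/cat.png\\?raw=true"
print(mimetypes.guess_type(clean_image_url(url).split("?")[0]))  # ('image/png', None)
```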
Krrish Dholakia
73bfc299f7
fix(factory.py): check if empty text block passed in (anthropic_messages_pt)
2024-07-17 11:27:22 -07:00
Krrish Dholakia
bc9f5eb628
fix(factory.py): handle content not being set
2024-07-16 21:51:30 -07:00
Krrish Dholakia
9771edb302
fix(factory.py): use stronger typing for anthropic translation
...
Fixes https://github.com/BerriAI/litellm/issues/4738
2024-07-16 19:58:27 -07:00
Krrish Dholakia
155ee7e99c
fix(factory.py): allow converting pdf url to base64
2024-07-15 21:11:53 -07:00
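A small sketch of what "converting a pdf url to base64" could look like; the function name and the data-URL output format are assumptions:

```python
import base64
import httpx

def pdf_url_to_base64(url: str) -> str:
    """Download a PDF and return it as a base64 data string (illustrative)."""
    response = httpx.get(url)
    response.raise_for_status()
    encoded = base64.b64encode(response.content).decode("utf-8")
    return f"data:application/pdf;base64,{encoded}"
```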
Ishaan Jaff
03933de775
fix exception raised in factory.py
2024-07-13 09:54:32 -07:00
Krrish Dholakia
0decc36bed
fix(factory.py): handle message content being a list instead of string
...
Fixes https://github.com/BerriAI/litellm/issues/4679
2024-07-12 19:00:39 -07:00
Krrish Dholakia
54c31e5af7
fix(factory.py): check if assistant message content is not none before translating
2024-07-09 07:48:12 -07:00
Krrish Dholakia
f15cb2bdc8
fix(factory.py): flexible handling of image file type in b64 encode
...
Fixes https://github.com/BerriAI/litellm/issues/4589
2024-07-08 09:46:47 -07:00
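The flexible-image-type fix suggests inferring the media type rather than assuming JPEG. One way to sketch that is with magic bytes; the signature table and JPEG fallback are assumptions:

```python
import base64

# Assumed magic-byte prefixes; a real implementation might instead rely on
# the Content-Type response header.
_SIGNATURES = {
    b"\x89PNG": "image/png",
    b"\xff\xd8\xff": "image/jpeg",
    b"GIF8": "image/gif",
    b"RIFF": "image/webp",
}

def encode_image_bytes(data: bytes) -> str:
    media_type = "image/jpeg"  # assumed fallback
    for signature, mime in _SIGNATURES.items():
        if data.startswith(signature):
            media_type = mime
            break
    return f"data:{media_type};base64,{base64.b64encode(data).decode('utf-8')}"
```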
Krrish Dholakia
fd25117b67
fix(main.py): fix azure ai cohere tool calling
2024-07-04 11:46:14 -07:00
Krrish Dholakia
a8e181369d
fix(cohere.py): fix message parsing to handle tool calling correctly
2024-07-04 11:13:07 -07:00
Krrish Dholakia
d4577fe476
fix(factory.py): fix linting
2024-07-04 08:21:19 -07:00
Krish Dholakia
3184640291
Merge pull request #4522 from andrewmjc/matchingToolMessageSpec
...
[Bug Fix] Use OpenAI Tool Response Spec When Converting To Gemini/VertexAI Tool Response
2024-07-04 08:03:38 -07:00
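The merged PR above converts OpenAI-spec tool messages into Gemini/Vertex AI tool responses. A plain-dict sketch of that mapping (not the Vertex SDK types; the field layout is an approximation):

```python
def openai_tool_message_to_gemini_part(message: dict, tool_name: str) -> dict:
    # OpenAI tool results look like:
    # {"role": "tool", "tool_call_id": "call_abc", "content": "72F and sunny"}
    # Gemini expects a function_response part keyed by the function's name.
    return {
        "function_response": {
            "name": tool_name,
            "response": {"content": message["content"]},
        }
    }
```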
Krish Dholakia
5e47970eed
Merge branch 'main' into litellm_anthropic_tool_calling_streaming_fix
2024-07-03 20:43:51 -07:00
Krrish Dholakia
2e5a81f280
fix(utils.py): stream_options working across all providers
2024-07-03 20:40:46 -07:00
nick-rackauckas
ce595027ad
Fix LiteLLM Granite prompt template
2024-07-03 17:52:28 -07:00
andrewmjc
95dc480204
unit test
2024-07-03 12:23:26 -06:00
andrewmjc
e07b110b47
matching openai tool result spec
2024-07-02 16:57:13 -06:00
Krrish Dholakia
05dfc63b88
feat(vertex_httpx.py): support the 'response_schema' param for older vertex ai models - pass as prompt (user-controlled)
...
if 'response_schema' is not supported for the vertex model (e.g. gemini-1.5-flash), pass it in the prompt
2024-06-29 13:25:27 -07:00
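For models without native 'response_schema' support, the commit above falls back to putting the schema into the prompt itself. A hedged sketch of that fallback; the helper name and instruction wording are invented:

```python
import json

def add_response_schema_to_prompt(prompt: str, response_schema: dict) -> str:
    # When the model cannot enforce a schema server-side, append it to the
    # prompt and ask for matching JSON output.
    return (
        f"{prompt}\n\n"
        "Respond with JSON matching this schema, and nothing else:\n"
        f"{json.dumps(response_schema)}"
    )

schema = {"type": "object", "properties": {"answer": {"type": "string"}}}
print(add_response_schema_to_prompt("What is the capital of France?", schema))
```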
Krrish Dholakia
2faa6f704a
fix(factory.py): get image type from response headers
...
Fixes https://github.com/BerriAI/litellm/issues/4441
2024-06-27 18:19:16 -07:00
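The fix above reads the image type from the HTTP response headers instead of guessing from the URL. A minimal sketch, assuming httpx and a data-URL result; the fallback type is an assumption:

```python
import base64
import httpx

def fetch_image_as_data_url(url: str) -> str:
    response = httpx.get(url)
    response.raise_for_status()
    # Prefer the server-reported Content-Type over guessing from the URL,
    # which fails for extensionless or redirected image links.
    media_type = response.headers.get("content-type", "image/jpeg")
    encoded = base64.b64encode(response.content).decode("utf-8")
    return f"data:{media_type};base64,{encoded}"
```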
Ishaan Jaff
abf919eef2
add nvidia nim to __init__
2024-06-25 08:53:06 -07:00
corrm
5b720137fd
Improve ollama prompt: this formula gives good results with AutoGen
2024-06-25 13:53:27 +03:00
corrm
b69a092f56
Rename ollama prompt:
...
- 'Function' to 'FunctionName'
- 'Tool Call' to 'FunctionCall'
- 'Tool Call Result' to 'FunctionCall Result'
_I found that these changes make some models perform better_
2024-06-25 12:40:07 +03:00
Islam Nofl
e2af13550a
Rename ollama prompt 'Function' to 'Name'
2024-06-24 08:01:15 +03:00
corrm
b2c1a3ad79
chore: Improved prompt generation in ollama_pt function
2024-06-24 05:54:58 +03:00
Krrish Dholakia
3f7252c422
fix(support-passing-image-url-to-gemini-via-vertex-ai): Closes https://github.com/BerriAI/litellm/issues/4262
2024-06-18 10:55:58 -07:00
Krrish Dholakia
e66b3d264f
fix(factory.py): handle bedrock claude image urls
2024-06-07 10:04:03 -07:00
Krish Dholakia
26993c067e
Merge branch 'main' into litellm_bedrock_converse_api
2024-06-07 08:49:52 -07:00
Krrish Dholakia
a995a0b172
fix(bedrock_httpx.py): working claude 3 function calling
2024-06-06 20:12:41 -07:00
Rajan Paneru
2e25416c02
Table format and typo fixes
...
* Fixed markdown table formatting that rendered poorly in PyCharm
* Typo fixes.
2024-06-06 22:29:59 +09:30
Krrish Dholakia
a76a9b7d11
feat(bedrock_httpx.py): add support for bedrock converse api
...
closes https://github.com/BerriAI/litellm/issues/4000
2024-06-05 21:20:36 -07:00
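litellm drives the Converse API over httpx in bedrock_httpx.py; the boto3 call below is only a quick way to show the request/response shape that API expects (the model id and prompt are placeholders):

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")
response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model id
    messages=[{"role": "user", "content": [{"text": "Hello, Claude."}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```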
Krrish Dholakia
e8df9c4041
fix(factory.py): fix linting error
2024-05-24 19:12:09 -07:00
Miguel Escriva
43dc97a331
fix(factory.py): Ollama vision fix.
...
Add convert_to_ollama_image function to handle image conversion.
2024-05-23 11:11:20 +02:00
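A sketch of what a convert_to_ollama_image helper might do: Ollama expects raw base64 image data, so URLs are fetched and data-URL prefixes stripped. The exact behaviour here is an assumption:

```python
import base64
import httpx

def convert_to_ollama_image(image: str) -> str:
    """Return raw base64 image data in the form Ollama expects (sketch)."""
    if image.startswith(("http://", "https://")):
        response = httpx.get(image)
        response.raise_for_status()
        return base64.b64encode(response.content).decode("utf-8")
    if image.startswith("data:"):
        # Strip a data-URL prefix like "data:image/png;base64,"
        return image.split(",", 1)[1]
    return image  # assume it is already plain base64
```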
Ishaan Jaff
622e241e18
Merge pull request #3711 from hmcp22/add_gemini_base64_image_support
...
Adding decoding of base64 image data for gemini pro 1.5
2024-05-20 13:34:39 -07:00
Krrish Dholakia
a2c66ed4fb
fix(vertex_ai.py): support passing in result of tool call to vertex
...
Fixes https://github.com/BerriAI/litellm/issues/3709
2024-05-19 11:34:07 -07:00
hugo.lima
32e25cdb11
added decoding of base64 image data in _gemini_vision_convert_messages function
2024-05-17 16:46:56 +01:00
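A minimal sketch of decoding base64 image data out of a data URL, as the commit above does for Gemini 1.5 Pro; the helper name and return shape are illustrative:

```python
import base64

def decode_data_url(image: str) -> tuple[bytes, str]:
    """Split a data URL into raw bytes and its mime type (illustrative)."""
    # e.g. "data:image/png;base64,iVBORw0KGgo..."
    header, b64_data = image.split(",", 1)
    mime_type = header.split(";")[0].removeprefix("data:")
    return base64.b64decode(b64_data), mime_type
```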
Krish Dholakia
8f6ae9a059
Merge pull request #3369 from mogith-pn/main
...
Clarifai-LiteLLM : Added clarifai as LLM Provider.
2024-05-11 09:31:46 -07:00
Krish Dholakia
8015bc1c47
Revert "Add support for async streaming to watsonx provider "
2024-05-09 07:44:15 -07:00
Krish Dholakia
3f13251241
Merge pull request #3479 from simonsanvil/feature/watsonx-integration
...
Add support for async streaming to watsonx provider
2024-05-08 22:19:05 -07:00
Ishaan Jaff
6d955ef457
Merge branch 'main' into ui_show_spend_end_user
2024-05-08 18:29:25 -07:00