Commit graph

1336 commits

Author SHA1 Message Date
Krrish Dholakia
8571cb45e8 fix(http_handler.py): add retry logic for httpx.ConnectError 2024-06-28 22:35:26 -07:00
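The retry-on-connection-error fix above can be sketched as a small helper. This is an illustrative stand-in, not litellm's actual `http_handler.py` code: the `with_retries` name is hypothetical, and the stdlib `ConnectionError` stands in for `httpx.ConnectError`.

```python
import time

def with_retries(fn, retries=3, backoff=0.0, retry_on=(ConnectionError,)):
    """Call fn(), retrying transient connection errors up to `retries` times.

    Illustrative sketch: in the real fix the caught type would be
    httpx.ConnectError, and fn() would issue the HTTP request.
    """
    for attempt in range(retries):
        try:
            return fn()
        except retry_on:
            if attempt == retries - 1:
                raise  # out of retries: surface the original error
            time.sleep(backoff * (2 ** attempt))  # exponential backoff
```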
Krrish Dholakia
d10912beeb fix(main.py): pass in openrouter as custom provider for openai client call
Fixes https://github.com/BerriAI/litellm/issues/4414
2024-06-28 21:26:42 -07:00
Ishaan Jaff
71890609ff fix gemini tool calling issue 2024-06-28 18:03:33 -07:00
Ishaan Jaff
1980a07f32 fix test custom callback router 2024-06-28 16:54:28 -07:00
Krrish Dholakia
c151a1d244 fix(http_handler.py): raise more detailed http status errors 2024-06-28 15:12:38 -07:00
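"More detailed http status errors" generally means surfacing the status code together with the provider's response body instead of a bare code. A minimal stdlib sketch of that idea, with a hypothetical `HTTPStatusError` stand-in (the real handler works with httpx responses):

```python
class HTTPStatusError(Exception):
    """Stand-in error carrying both status code and response body."""
    def __init__(self, status_code, body):
        self.status_code = status_code
        self.body = body
        super().__init__(f"HTTP {status_code}: {body}")

def raise_for_status(status_code, body):
    # Include the response body so the caller sees *why* the request
    # failed, not just the numeric status code.
    if status_code >= 400:
        raise HTTPStatusError(status_code, body)
```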
Krrish Dholakia
a7122f91a1 fix(support-'alt=sse'-param): Fixes https://github.com/BerriAI/litellm/issues/4459 2024-06-28 10:40:47 -07:00
Tiger Yu
b0c1d235be Include vertex_ai_beta in vertex_ai param mapping 2024-06-28 10:36:58 -07:00
Krrish Dholakia
6b14cf7657 fix(vertex_httpx.py): only use credential project id, if user project id not given 2024-06-27 22:08:14 -07:00
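The precedence rule in the commit above (credential project id only as a fallback) amounts to a one-line decision. A minimal sketch, with a hypothetical `resolve_project_id` helper name:

```python
def resolve_project_id(user_project_id, credential_project_id):
    """Prefer the caller-supplied project id; use the id embedded in the
    service-account credentials only when none was given explicitly."""
    return user_project_id if user_project_id is not None else credential_project_id
```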
Krish Dholakia
1223b2b111 Merge pull request #4449 from BerriAI/litellm_azure_tts
feat(azure.py): azure tts support
2024-06-27 21:33:38 -07:00
Krish Dholakia
6f030043e6 Merge pull request #4445 from ushuz/fix-google-creds
fix: do not resolve vertex project id from creds
2024-06-27 21:31:59 -07:00
Ishaan Jaff
7bedb5ca52 Merge pull request #4451 from BerriAI/litellm_azure_fix_pre_post_logging
[Fix] Azure Post-API Call occurs before Pre-API Call in CustomLogger
2024-06-27 20:18:14 -07:00
Krrish Dholakia
2faa6f704a fix(factory.py): get image type from response headers
Fixes https://github.com/BerriAI/litellm/issues/4441
2024-06-27 18:19:16 -07:00
Ishaan Jaff
57ba0a46b7 azure - log post api call 2024-06-27 17:38:03 -07:00
Krrish Dholakia
c14cc35e52 feat(azure.py): azure tts support
Closes https://github.com/BerriAI/litellm/issues/4002
2024-06-27 16:59:25 -07:00
John HU
933101a3f8 Do not resolve project id from creds 2024-06-27 15:26:36 -07:00
Ishaan Jaff
80d8bf5d8f fix: raise a better error message on failed vertex import 2024-06-27 12:02:19 -07:00
Ishaan Jaff
4707c4fccd fix gemini test 2024-06-26 19:00:30 -07:00
Ishaan Jaff
d213f81b4c add initial support for volcengine 2024-06-26 16:53:44 -07:00
Ishaan Jaff
90892d4414 Merge pull request #4431 from BerriAI/litellm_use_openai_client_proxy
[Fix-Proxy] Azure Embeddings use AsyncAzureOpenAI Client initialized on litellm.Router for requests
2024-06-26 16:37:31 -07:00
Krrish Dholakia
151d19960e fix(bedrock_httpx.py): Fix https://github.com/BerriAI/litellm/issues/4415 2024-06-26 16:19:46 -07:00
Ishaan Jaff
b7bca0af6c fix - reuse client initialized on proxy config 2024-06-26 16:16:58 -07:00
Ishaan Jaff
ae431eb85d Merge pull request #4418 from BerriAI/litellm_fireworks_ai_tool_calling
[Feat] Add Fireworks AI Tool calling support
2024-06-26 08:30:06 -07:00
Brian Schultheiss
eeedfceee4 Merge branch 'main' of https://github.com/BerriAI/litellm into litellm_ftr_bedrock_aws_session_token 2024-06-26 08:11:34 -07:00
Krrish Dholakia
09f4eb7617 docs(reliable_completions.md): improve headers for easier searching 2024-06-26 08:09:31 -07:00
Ishaan Jaff
8a7f2921f2 fix + test fireworks ai param mapping for tools 2024-06-26 06:57:04 -07:00
Ishaan Jaff
dcdf266f36 fix fireworks ai config 2024-06-26 06:40:44 -07:00
Ishaan Jaff
493a737787 feat - add fireworks ai config for param mapping 2024-06-26 06:29:21 -07:00
Krrish Dholakia
e813e984f7 fix(predibase.py): support json schema on predibase 2024-06-25 16:03:47 -07:00
Brian Schultheiss
5dce53579e Revert some non-essential changes 2024-06-25 14:09:55 -07:00
Brian Schultheiss
746f864fb2 Revert some non-essential changes 2024-06-25 13:57:12 -07:00
Krrish Dholakia
6889a4c0dd fix(utils.py): predibase exception mapping - map 424 as a badrequest error 2024-06-25 13:47:38 -07:00
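The 424 mapping above can be sketched as a status-code lookup table. The `BadRequestError` class and `map_predibase_error` helper here are illustrative stand-ins for litellm's actual exception types:

```python
class BadRequestError(Exception):
    """Stand-in for a 400-style bad-request exception type."""

# Hypothetical table: Predibase's 424 ("failed dependency") is treated
# as a bad request rather than a retryable server error.
STATUS_TO_EXCEPTION = {
    400: BadRequestError,
    424: BadRequestError,
}

def map_predibase_error(status_code):
    # Unknown codes fall back to a generic Exception.
    return STATUS_TO_EXCEPTION.get(status_code, Exception)
```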
Brian Schultheiss
80b4af7abe Revert some non-essential changes 2024-06-25 13:29:33 -07:00
Ishaan Jaff
07829514d1 feat - add param mapping for nvidia nim 2024-06-25 09:13:08 -07:00
Ishaan Jaff
abf919eef2 add nvidia nim to __init__ 2024-06-25 08:53:06 -07:00
Krish Dholakia
75c8f77d9a Merge pull request #4373 from CorrM/main
[Fix-Improve] Improve Ollama prompt input and fix Ollama function calling key error
2024-06-25 06:27:43 -07:00
corrm
5b720137fd Improve ollama prompt: this format gives good results with AutoGen 2024-06-25 13:53:27 +03:00
corrm
b69a092f56 Rename ollama prompt labels:
- 'Function' to 'FunctionName'
- 'Tool Call' to 'FunctionCall'
- 'Tool Call Result' to 'FunctionCall Result'

_I found that these changes make some models perform better_
2024-06-25 12:40:07 +03:00
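The renamed labels in the commit above can be seen in a minimal sketch of how a tool call might be rendered into an Ollama prompt. The `render_tool_call` helper and exact layout are illustrative assumptions, not litellm's actual template:

```python
import json

def render_tool_call(name, arguments, result=None):
    """Render a tool invocation using the renamed labels ('FunctionName',
    'FunctionCall', 'FunctionCall Result'); layout is illustrative only."""
    lines = [
        f"FunctionName: {name}",
        f"FunctionCall: {json.dumps({'name': name, 'arguments': arguments})}",
    ]
    if result is not None:
        lines.append(f"FunctionCall Result: {result}")
    return "\n".join(lines)
```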
Krish Dholakia
5d570e7c6c Merge pull request #4392 from BerriAI/litellm_gemini_content_policy_errors
fix(vertex_httpx.py): cover gemini content violation (on prompt)
2024-06-24 20:00:06 -07:00
Krrish Dholakia
d182ea0f77 fix(utils.py): catch 422-status errors 2024-06-24 19:41:48 -07:00
Krrish Dholakia
8e6e5a6d37 fix(vertex_httpx.py): Return empty model response for content filter violations 2024-06-24 19:22:20 -07:00
Krrish Dholakia
1ff0129a94 fix(vertex_httpx.py): cover gemini content violation (on prompt) 2024-06-24 19:13:56 -07:00
Islam Nofl
e2af13550a Rename ollama prompt 'Function' word to 'Name' 2024-06-24 08:01:15 +03:00
corrm
b8a8b0847c Added improved function name handling in ollama_async_streaming 2024-06-24 05:56:56 +03:00
corrm
423a60c8bc chore: Improved OllamaConfig get_required_params and ollama_acompletion and ollama_async_streaming functions 2024-06-24 05:55:22 +03:00
corrm
b2c1a3ad79 chore: Improved prompt generation in ollama_pt function 2024-06-24 05:54:58 +03:00
Brian Schultheiss
3fbb25f903 Updated more references to AWS session token 2024-06-23 13:37:38 -07:00
Brian Schultheiss
7f91e53548 updated documentation to reference boto3.client credential extraction, and updated boto3.client creation to support session_token 2024-06-23 13:15:04 -07:00
Brian Schultheiss
8ad1ae73e5 Support aws_session_token for bedrock client. https://github.com/BerriAI/litellm/issues/4346 2024-06-23 12:51:25 -07:00
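Supporting `aws_session_token` means passing it through to `boto3.client(...)` when temporary (STS) credentials are in use. A sketch of the kwargs assembly, with a hypothetical `bedrock_client_kwargs` helper so it stays runnable without boto3 installed:

```python
def bedrock_client_kwargs(access_key, secret_key, session_token=None,
                          region="us-east-1"):
    """Assemble kwargs for boto3.client('bedrock-runtime', **kwargs).

    The session token is only included when present: static IAM keys
    don't have one, while temporary STS credentials require it.
    """
    kwargs = {
        "aws_access_key_id": access_key,
        "aws_secret_access_key": secret_key,
        "region_name": region,
    }
    if session_token is not None:
        kwargs["aws_session_token"] = session_token
    return kwargs
```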
Krrish Dholakia
cea630022e fix(add-exception-mapping-+-langfuse-exception-logging-for-streaming-exceptions): add exception mapping + langfuse exception logging for streaming exceptions
Fixes https://github.com/BerriAI/litellm/issues/4338
2024-06-22 21:26:15 -07:00
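Exception mapping for streaming means wrapping the generator so errors raised mid-stream are translated (and can be logged to a callback such as Langfuse) instead of escaping raw. A minimal sketch with a hypothetical `map_stream_errors` wrapper:

```python
def map_stream_errors(gen, mapper):
    """Wrap a streaming generator so provider errors raised mid-stream
    are mapped via `mapper` (a logging hook could also fire here)
    instead of escaping as raw provider exceptions."""
    try:
        yield from gen
    except Exception as e:
        # In the real fix this is also where the failure would be
        # reported to the configured logger before re-raising.
        raise mapper(e) from e
```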
Krrish Dholakia
0fd9033502 fix(vertex_httpx.py): flush remaining chunks from stream 2024-06-22 20:33:54 -07:00
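"Flush remaining chunks" is the classic tail-of-buffer problem: a stream that only emits full buffers silently drops whatever is left when the input ends. A minimal sketch of the pattern (the `stream_chunks` helper is illustrative, not litellm's parser):

```python
def stream_chunks(pieces, chunk_size=3):
    """Yield fixed-size chunks, then flush whatever remains in the
    buffer so the tail of the stream is not silently dropped."""
    buf = []
    for p in pieces:
        buf.append(p)
        if len(buf) == chunk_size:
            yield "".join(buf)
            buf = []
    if buf:  # flush remaining chunks at end of stream
        yield "".join(buf)
```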