Name | Last commit | Date
ai21/chat | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
aiohttp_openai/chat | [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable (#8860) | 2025-02-26 21:11:06 -08:00
anthropic | (Refactor) /v1/messages to follow simpler logic for Anthropic API spec (#9013) | 2025-03-06 00:43:08 -08:00
azure | Merge branch 'main' into litellm_dev_03_10_2025_p3 | 2025-03-12 14:56:01 -07:00
azure_ai | fix(azure_ai/): fix transformation to handle when models don't support tool_choice | 2025-03-11 21:42:24 -07:00
base_llm | fix code quality checks | 2025-03-12 11:47:10 -07:00
bedrock | Merge pull request #9166 from BerriAI/litellm_dev_03_11_2025_p2 | 2025-03-11 22:51:20 -07:00
cerebras | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
clarifai | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
cloudflare/chat | [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable (#8860) | 2025-02-26 21:11:06 -08:00
codestral/completion | Bug fix - String data: stripped from entire content in streamed Gemini responses (#9070) | 2025-03-07 21:06:39 -08:00
cohere | Add cohere v2/rerank support (#8421) (#8605) | 2025-02-22 22:25:29 -08:00
custom_httpx | fix code quality checks | 2025-03-12 11:47:10 -07:00
databricks | Bug fix - String data: stripped from entire content in streamed Gemini responses (#9070) | 2025-03-07 21:06:39 -08:00
deepgram | Litellm dev 01 02 2025 p2 (#7512) | 2025-01-02 21:57:51 -08:00
deepinfra/chat | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
deepseek | [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable (#8860) | 2025-02-26 21:11:06 -08:00
deprecated_providers | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
empower/chat | LiteLLM Common Base LLM Config (pt.3): Move all OAI compatible providers to base llm config (#7148) | 2024-12-10 17:12:42 -08:00
fireworks_ai | Fix bedrock passing response_format: {"type": "text"} (#8900) | 2025-02-28 20:09:59 -08:00
friendliai/chat | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
galadriel/chat | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
gemini | Support format param for specifying image type (#9019) | 2025-03-05 19:52:53 -08:00
github/chat | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
groq | fix(groq/chat/transformation.py): fix groq response_format transformation (#7565) | 2025-01-04 19:39:04 -08:00
hosted_vllm | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
huggingface | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
infinity/rerank | add bedrock llama vision support + cohere / infinity rerank - 'return_documents' support (#8684) | 2025-02-20 21:23:54 -08:00
jina_ai | Add cohere v2/rerank support (#8421) (#8605) | 2025-02-22 22:25:29 -08:00
litellm_proxy/chat | [BETA] Add OpenAI /images/variations + Topaz API support (#7700) | 2025-01-11 23:27:46 -08:00
lm_studio | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
mistral | _handle_tool_call_message linting | 2025-01-16 22:34:16 -08:00
nlp_cloud | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
nvidia_nim | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
ollama | [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable (#8860) | 2025-02-26 21:11:06 -08:00
oobabooga | Litellm dev 12 30 2024 p2 (#7495) | 2025-01-01 18:57:29 -08:00
openai | Merge branch 'main' into litellm_dev_03_10_2025_p3 | 2025-03-12 14:56:01 -07:00
openai_like | fix: propagating json_mode to acompletion (#8133) | 2025-01-30 21:17:26 -08:00
openrouter | Support openrouter reasoning_content on streaming (#9094) | 2025-03-09 20:03:59 -07:00
perplexity/chat | fix missing comma | 2025-02-24 01:00:07 +05:30
petals | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
predibase | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
replicate | [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable (#8860) | 2025-02-26 21:11:06 -08:00
sagemaker | Bug fix - String data: stripped from entire content in streamed Gemini responses (#9070) | 2025-03-07 21:06:39 -08:00
sambanova | update sambanova docs (#8875) | 2025-02-27 20:23:33 -08:00
together_ai | add bedrock llama vision support + cohere / infinity rerank - 'return_documents' support (#8684) | 2025-02-20 21:23:54 -08:00
topaz | (Feat) - Add /bedrock/invoke support for all Anthropic models (#8383) | 2025-02-07 22:41:11 -08:00
triton | fix: fix linting error | 2025-03-10 21:51:55 -07:00
vertex_ai | Bug fix - String data: stripped from entire content in streamed Gemini responses (#9070) | 2025-03-07 21:06:39 -08:00
vllm/completion | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
voyage/embedding | Litellm dev 12 30 2024 p2 (#7495) | 2025-01-01 18:57:29 -08:00
watsonx | [Bug]: Deepseek error on proxy after upgrading to 1.61.13-stable (#8860) | 2025-02-26 21:11:06 -08:00
xai/chat | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
__init__.py | add linting | 2023-08-18 11:05:05 -07:00
base.py | Complete 'requests' library removal (#7350) | 2024-12-22 07:21:25 -08:00
baseten.py | test(base_llm_unit_tests.py): add test to ensure drop params is respe… (#8224) | 2025-02-03 16:04:44 -08:00
custom_llm.py | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
maritalk.py | fix: dictionary changed size during iteration error (#8327) (#8341) | 2025-02-07 16:20:28 -08:00
ollama_chat.py | Litellm dev 03 08 2025 p3 (#9089) | 2025-03-09 18:20:56 -07:00
README.md | LiteLLM Minor Fixes and Improvements (09/13/2024) (#5689) | 2024-09-14 10:02:55 -07:00
volcengine.py | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00