Krish Dholakia
9b0f871129
Add /vllm/* and /mistral/* passthrough endpoints (adds support for Mistral OCR via passthrough)
* feat(llm_passthrough_endpoints.py): support mistral passthrough
Closes https://github.com/BerriAI/litellm/issues/9051
* feat(llm_passthrough_endpoints.py): initial commit for adding vllm passthrough route
* feat(vllm/common_utils.py): add new vllm model info route
makes it possible to use the vllm passthrough route via a factory function
* fix(llm_passthrough_endpoints.py): add all methods to vllm passthrough route
* fix: fix linting error
* fix: fix linting error
* fix: fix ruff check
* fix(proxy/_types.py): add new passthrough routes
* docs(config_settings.md): add mistral env vars to docs
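A hedged usage sketch of the new passthrough route: it assumes a LiteLLM proxy running on localhost:4000 with MISTRAL_API_KEY configured server-side, and mirrors Mistral's /v1/ocr request shape; the proxy key, model name, and document URL are illustrative, not taken from the commit.

```python
import httpx

# Call Mistral OCR through the proxy's /mistral/* passthrough route.
# The proxy forwards the request to api.mistral.ai and injects the provider key.
response = httpx.post(
    "http://localhost:4000/mistral/v1/ocr",
    headers={"Authorization": "Bearer sk-1234"},  # hypothetical proxy virtual key
    json={
        "model": "mistral-ocr-latest",
        "document": {
            "type": "document_url",
            "document_url": "https://example.com/sample.pdf",
        },
    },
    timeout=60.0,
)
print(response.json())
```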
2025-04-14 22:06:33 -07:00
Ishaan Jaff
c7f14e936a
(code quality) run ruff rule to ban unused imports ( #7313 )
* remove unused imports
* fix AmazonConverseConfig
* fix test
* fix import
* ruff check fixes
* test fixes
* fix testing
* fix imports
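For context, F401 is the ruff/pyflakes rule code for unused imports; a small illustration of the kind of line this change bans, which `ruff check --select F401 --fix .` removes automatically (file and function names are made up):

```python
# example_module.py
import os    # F401: 'os' imported but unused -- ruff --fix deletes this line
import json  # used below, so this import is kept


def to_json(payload: dict) -> str:
    return json.dumps(payload)
```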
2024-12-19 12:33:42 -08:00
Krish Dholakia
b82add11ba
LITELLM: Remove requests library usage ( #7235 )
* fix(generic_api_callback.py): remove requests lib usage
* fix(budget_manager.py): remove requests lib usage
* fix(main.py): cleanup requests lib usage
* fix(utils.py): remove requests lib usage
* fix(argilla.py): fix argilla test
* fix(athina.py): replace 'requests' lib usage with litellm module
* fix(greenscale.py): replace 'requests' lib usage with httpx
* fix: remove unused 'requests' lib import + replace usage in some places
* fix(prompt_layer.py): remove 'requests' lib usage from prompt layer
* fix(ollama_chat.py): remove 'requests' lib usage
* fix(baseten.py): replace 'requests' lib usage
* fix(codestral/): replace 'requests' lib usage
* fix(predibase/): replace 'requests' lib usage
* refactor: cleanup unused 'requests' lib imports
* fix(oobabooga.py): cleanup 'requests' lib usage
* fix(invoke_handler.py): remove unused 'requests' lib usage
* refactor: cleanup unused 'requests' lib import
* fix: fix linting errors
* refactor(ollama/): move ollama to using base llm http handler
removes 'requests' lib dep for ollama integration
* fix(ollama_chat.py): fix linting errors
* fix(ollama/completion/transformation.py): convert non-jpeg/png image to jpeg/png before passing to ollama
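The replacement pattern behind most of these commits, sketched against a hypothetical logging endpoint (httpx's API is close enough to requests that the swap is largely mechanical):

```python
import httpx


# before: response = requests.post(url, json=payload, headers=headers)
def post_event(url: str, payload: dict, headers: dict) -> dict:
    # httpx client with an explicit timeout, mirroring the requests call shape
    with httpx.Client(timeout=30.0) as client:
        response = client.post(url, json=payload, headers=headers)
        response.raise_for_status()
        return response.json()
```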
2024-12-17 12:50:04 -08:00
Ivan Vykopal
553453daa9
Fix vllm import ( #7224 )
* fix: Fix vllm import
* Update handler.py
2024-12-14 15:57:49 -08:00
Ishaan Jaff
b5d55688e5
(Refactor) Code Quality improvement - remove /prompt_templates/, base_aws_llm.py from /llms folder ( #7164 )
* fix move base_aws_llm
* fix import
* update enforce llms folder style
* move prompt_templates
* update prompt_templates location
* fix imports
* fix imports
* fix imports
* fix imports
* fix checks
2024-12-11 00:02:46 -08:00
Krish Dholakia
d5aae81c6d
Litellm vllm refactor ( #7158 )
* refactor(vllm/): move vllm to use base llm config
* test: mark flaky test
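A minimal, self-contained sketch of the "base llm config" pattern this refactor moves vLLM onto: a shared base class declares the param-mapping hooks and each provider config implements them (class and method names here are illustrative, not litellm's exact API).

```python
from abc import ABC, abstractmethod


class BaseLLMConfig(ABC):
    """Shared interface each provider config implements (illustrative)."""

    @abstractmethod
    def get_supported_openai_params(self, model: str) -> list[str]: ...

    @abstractmethod
    def map_openai_params(self, non_default_params: dict, model: str) -> dict: ...


class VLLMConfig(BaseLLMConfig):
    def get_supported_openai_params(self, model: str) -> list[str]:
        # vLLM exposes an OpenAI-compatible server, so most params pass through
        return ["temperature", "max_tokens", "top_p", "stream"]

    def map_openai_params(self, non_default_params: dict, model: str) -> dict:
        supported = set(self.get_supported_openai_params(model))
        return {k: v for k, v in non_default_params.items() if k in supported}
```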
2024-12-10 21:48:35 -08:00