llama-stack-mirror/llama_stack/providers/remote/inference/vllm
Latest commit ae74b31ae3 by Matthew Farrellee (2025-10-06 13:27:30 -04:00):
chore: remove vLLM inference adapter's custom list_models (#3703)
# What does this PR do?

Remove the vLLM inference adapter's custom `list_models` implementation and rely on the
standard implementation instead.
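
The standard implementation can list models through the OpenAI-compatible `/v1/models` endpoint that vLLM serves, so no adapter-specific logic is needed. A minimal sketch of that kind of listing, assuming a local vLLM server at `http://localhost:8000/v1` and using the `openai` client (this is an illustration, not the adapter's actual code):

```python
# Minimal sketch: list models from a vLLM server's OpenAI-compatible
# /v1/models endpoint, which is what a generic list_models implementation
# can rely on instead of adapter-specific code.
# Assumptions: a vLLM server is running at http://localhost:8000/v1; vLLM
# typically does not require a real API key, so a placeholder is used.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

for model in client.models.list():
    # Each entry exposes the served model id (e.g. the model name the
    # server was launched with).
    print(model.id)
```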

## Test Plan

CI
| File | Last commit | Date |
|------|-------------|------|
| `__init__.py` | chore: turn OpenAIMixin into a pydantic.BaseModel (#3671) | 2025-10-06 11:33:19 -04:00 |
| `config.py` | chore: use remoteinferenceproviderconfig for remote inference providers (#3668) | 2025-10-03 08:48:42 -07:00 |
| `vllm.py` | chore: remove vLLM inference adapter's custom list_models (#3703) | 2025-10-06 13:27:30 -04:00 |