llama-stack-mirror/llama_stack/providers
Matthew Farrellee ae74b31ae3
chore: remove vLLM inference adapter's custom list_models (#3703)
# What does this PR do?

Remove the vLLM inference adapter's custom `list_models` implementation and rely on the standard implementation instead.

## Test Plan

CI
2025-10-06 13:27:30 -04:00
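A minimal sketch of the pattern this PR moves toward, assuming a shared OpenAI-compatible mixin that already provides `list_models` by querying the backend's `/v1/models` endpoint; the class and method names below are illustrative, not the actual llama-stack APIs.

```python
# Hypothetical sketch: a shared OpenAI-compatible mixin supplies list_models,
# so the vLLM adapter no longer needs its own override.
# Names (OpenAICompatMixin, VLLMAdapter, get_base_url, get_api_key) are
# illustrative and not the real llama-stack classes.
from openai import AsyncOpenAI


class OpenAICompatMixin:
    """Shared behavior for adapters that speak the OpenAI-compatible API."""

    def get_base_url(self) -> str:
        raise NotImplementedError

    def get_api_key(self) -> str:
        raise NotImplementedError

    async def list_models(self) -> list[str]:
        # Single, standard implementation: ask the backend which models it serves.
        client = AsyncOpenAI(base_url=self.get_base_url(), api_key=self.get_api_key())
        page = await client.models.list()
        return [m.id for m in page.data]


class VLLMAdapter(OpenAICompatMixin):
    """vLLM adapter after the change: no custom list_models override remains."""

    def __init__(self, url: str, api_key: str = "not-needed"):
        self._url = url
        self._api_key = api_key

    def get_base_url(self) -> str:
        return self._url

    def get_api_key(self) -> str:
        return self._api_key
```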
| Path | Latest commit | Date |
|------|---------------|------|
| inline | feat(api): add extra_body parameter support with shields example (#3670) | 2025-10-03 13:25:09 -07:00 |
| registry | chore: turn OpenAIMixin into a pydantic.BaseModel (#3671) | 2025-10-06 11:33:19 -04:00 |
| remote | chore: remove vLLM inference adapter's custom list_models (#3703) | 2025-10-06 13:27:30 -04:00 |
| utils | chore: turn OpenAIMixin into a pydantic.BaseModel (#3671) | 2025-10-06 11:33:19 -04:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | feat: combine ProviderSpec datatypes (#3378) | 2025-09-18 16:10:00 +02:00 |