This PR adds a vLLM inference provider that targets an OpenAI-compatible vLLM server.
* wip
* config templates
* readmes
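
Since the PR mentions config templates, a sketch of what a remote vLLM provider entry might look like is shown below. All field names and the URL are illustrative assumptions, not the exact template shipped in this PR; vLLM's OpenAI-compatible server conventionally listens on port 8000 under the `/v1` path.

```yaml
# Hypothetical provider config sketch -- field names are illustrative,
# not necessarily the exact schema added by this PR.
inference:
  provider_type: remote::vllm
  config:
    # Base URL of a running OpenAI-compatible vLLM server
    # (e.g. started with `vllm serve <model>`).
    url: http://localhost:8000/v1
```

The provider would then forward chat/completion requests to that endpoint using the OpenAI-compatible API surface vLLM exposes.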