llama-stack-mirror/llama_stack/providers/adapters/inference/vllm
Yuan Tang 74e6356b51 Add vLLM inference provider for OpenAI compatible vLLM server (#178)
This PR adds a vLLM inference provider for an OpenAI-compatible vLLM server.
2024-10-21 10:46:45 -07:00
__init__.py    Add vLLM inference provider for OpenAI compatible vLLM server (#178)    2024-10-21 10:46:45 -07:00
config.py      Add vLLM inference provider for OpenAI compatible vLLM server (#178)    2024-10-21 10:46:45 -07:00
vllm.py        Add vLLM inference provider for OpenAI compatible vLLM server (#178)    2024-10-21 10:46:45 -07:00
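Because the adapter targets vLLM's OpenAI-compatible HTTP API, the server it points at can be exercised directly with the standard openai client. Below is a minimal sketch of that interaction, assuming a vLLM server running locally on port 8000 and serving a Llama model; the base URL, API key, and model name are illustrative assumptions and are not taken from this adapter's config.py.

```python
# Minimal sketch: querying an OpenAI-compatible vLLM server with the
# standard openai client. The base_url, api_key, and model name are
# illustrative assumptions, not values read from this adapter's config.
from openai import OpenAI

# vLLM exposes its OpenAI-compatible API under /v1; the api_key can be any
# placeholder unless the server was started with an --api-key.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # must match the model the vLLM server is serving
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

The adapter in this directory wraps this kind of HTTP interaction behind the llama-stack inference provider interface, with config.py holding the connection settings and vllm.py implementing the provider logic.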