llama-stack-mirror/llama_stack/providers/impls/vllm
Latest commit e45a417543 by Ashwin Bharambe (2024-10-08 17:23:02 -07:00): more fixes, plug shutdown handlers. Still, FastAPI's SIGINT handler is not calling ours.
__init__.py   Inline vLLM inference provider (#181)   2024-10-05 23:34:16 -07:00
config.py     Inline vLLM inference provider (#181)   2024-10-05 23:34:16 -07:00
vllm.py       more fixes, plug shutdown handlers      2024-10-08 17:23:02 -07:00
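
The commit note above points at a common failure mode: the ASGI server installs its own SIGINT handler, so a provider's cleanup (for example, tearing down the in-process vLLM engine) never runs. The sketch below is a hypothetical illustration of one way to chain a custom handler onto whatever SIGINT handler is already installed; install_chained_sigint and our_shutdown are illustrative names, not part of llama-stack or FastAPI.

import signal

def install_chained_sigint(our_shutdown):
    # Handler the serving framework (e.g. uvicorn) may have already installed.
    previous = signal.getsignal(signal.SIGINT)

    def handler(signum, frame):
        try:
            # Run our cleanup first (e.g. release the inference engine).
            our_shutdown()
        finally:
            if callable(previous):
                # Let the framework's original handler do its own shutdown.
                previous(signum, frame)
            elif previous == signal.SIG_DFL:
                # Restore and re-raise so the default behavior still applies.
                signal.signal(signal.SIGINT, signal.SIG_DFL)
                signal.raise_signal(signal.SIGINT)

    signal.signal(signal.SIGINT, handler)

Chaining to the previously installed handler, rather than replacing it, is what keeps both the framework's shutdown path and the provider's cleanup running; whether this matches the approach taken in vllm.py is an assumption.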