From 5764a95912051c8fa8a2db2a29ead21e2e25ba94 Mon Sep 17 00:00:00 2001
From: Yuan Tang
Date: Fri, 13 Dec 2024 17:06:27 -0500
Subject: [PATCH] Add missing environments field for vLLM provider (#623)

@ashwinb sorry I missed this earlier in
https://github.com/meta-llama/llama-stack/pull/604.

Signed-off-by: Yuan Tang
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 98ee0b5ad..dadafae90 100644
--- a/README.md
+++ b/README.md
@@ -90,7 +90,7 @@ Additionally, we have designed every element of the Stack such that APIs as well
 | Chroma                                       | Single Node            |                    |                    | :heavy_check_mark: |                    |                    |
 | PG Vector                                    | Single Node            |                    |                    | :heavy_check_mark: |                    |                    |
 | PyTorch ExecuTorch                           | On-device iOS          | :heavy_check_mark: | :heavy_check_mark: |                    |                    |                    |
-| [vLLM](https://github.com/vllm-project/vllm) |                        |                    | :heavy_check_mark: |                    |                    |                    |
+| [vLLM](https://github.com/vllm-project/vllm) | Hosted and Single Node |                    | :heavy_check_mark: |                    |                    |                    |

 ### Distributions