Add missing environments field for vLLM provider (#623)

@ashwinb sorry I missed this earlier in
https://github.com/meta-llama/llama-stack/pull/604.

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
Yuan Tang 2024-12-13 17:06:27 -05:00 committed by GitHub
parent 516e1a3e59
commit 5764a95912

@@ -90,7 +90,7 @@ Additionally, we have designed every element of the Stack such that APIs as well
 | Chroma | Single Node | | | :heavy_check_mark: | | |
 | PG Vector | Single Node | | | :heavy_check_mark: | | |
 | PyTorch ExecuTorch | On-device iOS | :heavy_check_mark: | :heavy_check_mark: | | | |
-| [vLLM](https://github.com/vllm-project/vllm) | | | :heavy_check_mark: | | | |
+| [vLLM](https://github.com/vllm-project/vllm) | Hosted and Single Node | | :heavy_check_mark: | | | |
 ### Distributions