llama-stack/llama_stack

Latest commit b10e9f46bb by Ashwin Bharambe (2024-11-06 14:42:44 -08:00):
Enable remote::vllm (#384)

* Enable remote::vllm
* Kill the giant list of hard coded models
Name            Last commit message                                   Last commit date
apis            add bedrock distribution code (#358)                  2024-11-06 14:39:11 -08:00
cli             Kill llama stack configure (#371)                     2024-11-06 13:32:10 -08:00
distribution    add bedrock distribution code (#358)                  2024-11-06 14:39:11 -08:00
providers       Enable remote::vllm (#384)                            2024-11-06 14:42:44 -08:00
scripts         Add a test for CLI, but not fully done so disabled    2024-09-19 13:27:07 -07:00
templates       Kill llama stack configure (#371)                     2024-11-06 13:32:10 -08:00
__init__.py     API Updates (#73)                                     2024-09-17 19:51:35 -07:00
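
The "remote::vllm" provider named in the latest commit refers to inference served by a vLLM instance running outside the stack. As a rough illustration only (this is not the adapter code under the providers directory), a vLLM server started with `vllm serve <model>` exposes an OpenAI-compatible endpoint that any client can query; the URL, API key, and model id below are placeholder assumptions.

    # Hypothetical sketch: query a remote vLLM server over its OpenAI-compatible API.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # assumed address of the remote vLLM server
        api_key="EMPTY",                      # vLLM does not require a real key by default
    )

    response = client.chat.completions.create(
        model="meta-llama/Llama-3.1-8B-Instruct",  # placeholder model id served by vLLM
        messages=[{"role": "user", "content": "Hello from a remote vLLM endpoint"}],
    )
    print(response.choices[0].message.content)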