llama-stack-mirror/llama_stack/templates
Name                            Last commit message                                                  Date
bedrock                         Update provider types and prefix with inline::                       2024-11-12 12:54:44 -08:00
databricks                      Split safety into (llama-guard, prompt-guard, code-scanner) (#400)   2024-11-11 09:29:18 -08:00
fireworks                       Fix docs yet again                                                   2024-11-18 23:51:35 -08:00
hf-endpoint                     Update provider types and prefix with inline::                       2024-11-12 12:54:44 -08:00
hf-serverless                   Update provider types and prefix with inline::                       2024-11-12 12:54:44 -08:00
inline-vllm                     Update provider types and prefix with inline::                       2024-11-12 12:54:44 -08:00
meta-reference-gpu              Add conda_env                                                        2024-11-18 16:08:14 -08:00
meta-reference-quantized-gpu    Update provider types and prefix with inline::                       2024-11-12 12:54:44 -08:00
ollama                          Update docs                                                          2024-11-18 23:21:25 -08:00
remote-vllm                     Fix docs yet again                                                   2024-11-18 23:51:35 -08:00
tgi                             Fix docs yet again                                                   2024-11-18 23:51:35 -08:00
together                        Fix docs yet again                                                   2024-11-18 23:51:35 -08:00
__init__.py                     Auto-generate distro yamls + docs (#468)                             2024-11-18 14:57:06 -08:00
template.py                     Add conda_env                                                        2024-11-18 16:08:14 -08:00