llama-stack-mirror/distributions

Latest commit cf3f0b0a33 (Martin Hickey, 2024-11-15 15:50:27 +00:00): Add Ollama GPU run file
Fix formatting issue also.
Signed-off-by: Martin Hickey <martin.hickey@ie.ibm.com>
| Directory | Last commit | Date |
| --- | --- | --- |
| bedrock | Fix conda env names in distribution example run template | 2024-11-15 15:32:52 +00:00 |
| databricks | fix broken --list-templates with adding build.yaml files for packaging (#327) | 2024-10-25 12:51:22 -07:00 |
| dell-tgi | Update provider types and prefix with inline:: | 2024-11-12 12:54:44 -08:00 |
| fireworks | Fix conda env names in distribution example run template | 2024-11-15 15:32:52 +00:00 |
| hf-endpoint | fix broken --list-templates with adding build.yaml files for packaging (#327) | 2024-10-25 12:51:22 -07:00 |
| hf-serverless | fix broken --list-templates with adding build.yaml files for packaging (#327) | 2024-10-25 12:51:22 -07:00 |
| inline-vllm | Rename all inline providers with an inline:: prefix (#423) | 2024-11-11 22:19:16 -08:00 |
| meta-reference-gpu | Fix conda env names in distribution example run template | 2024-11-15 15:32:52 +00:00 |
| meta-reference-quantized-gpu | Add Ollama GPU run file | 2024-11-15 15:50:27 +00:00 |
| ollama | Fix conda env names in distribution example run template | 2024-11-15 15:32:52 +00:00 |
| ollama-gpu | Add Ollama GPU run file | 2024-11-15 15:50:27 +00:00 |
| remote-vllm | add support for ${env.FOO_BAR} placeholders in run.yaml files (#439) | 2024-11-13 11:25:58 -08:00 |
| tgi | Fix conda env names in distribution example run template | 2024-11-15 15:32:52 +00:00 |
| together | Fix conda env names in distribution example run template | 2024-11-15 15:32:52 +00:00 |