llama-stack-mirror/distributions
Vladimir Ivic b2630901c3 Fix incorrect ollama port in ollama run.yaml template
Summary:
The default port in the template was not Ollama's actual default port, 11434.
2024-11-18 12:39:47 -08:00
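Ollama's API listens on port 11434 by default, so the ollama run.yaml template needs to point the remote provider at that port. Below is a minimal sketch of what the corrected provider entry could look like; the exact keys (providers, provider_id, provider_type, url) are illustrative assumptions and not copied from the template.

    # Hypothetical excerpt from the ollama run.yaml template; key names are assumptions.
    providers:
      inference:
        - provider_id: ollama
          provider_type: remote::ollama
          config:
            url: http://localhost:11434   # Ollama's default port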
bedrock                       Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
databricks                    fix broken --list-templates with adding build.yaml files for packaging (#327)  2024-10-25 12:51:22 -07:00
dell-tgi                      Update provider types and prefix with inline::  2024-11-12 12:54:44 -08:00
fireworks                     Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
hf-endpoint                   fix broken --list-templates with adding build.yaml files for packaging (#327)  2024-10-25 12:51:22 -07:00
hf-serverless                 fix broken --list-templates with adding build.yaml files for packaging (#327)  2024-10-25 12:51:22 -07:00
inline-vllm                   Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
meta-reference-gpu            Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
meta-reference-quantized-gpu  Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
ollama                        Fix incorrect ollama port in ollama run.yaml template  2024-11-18 12:39:47 -08:00
ollama-gpu                    Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
remote-vllm                   add support for ${env.FOO_BAR} placeholders in run.yaml files (#439)  2024-11-13 11:25:58 -08:00  (see the sketch after this listing)
tgi                           Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
together                      Rename all inline providers with an inline:: prefix (#423)  2024-11-11 22:19:16 -08:00
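The remote-vllm entry above mentions support for ${env.FOO_BAR}-style placeholders in run.yaml files (#439). A hedged sketch of how such a placeholder might appear in a run.yaml follows; the variable name VLLM_URL and the surrounding keys are chosen purely for illustration.

    # Hypothetical run.yaml fragment; VLLM_URL and the config keys are illustrative assumptions.
    providers:
      inference:
        - provider_id: vllm
          provider_type: remote::vllm
          config:
            url: ${env.VLLM_URL}   # substituted from the VLLM_URL environment variable when the config is loaded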