llama-stack-mirror/llama_stack/templates
Latest commit: f39d1732ea "list responses" (Eric Huang, 2025-05-23 13:00:59 -07:00)
| Name | Last commit message | Commit date |
|------|---------------------|-------------|
| bedrock | list responses | 2025-05-23 13:00:59 -07:00 |
| cerebras | list responses | 2025-05-23 13:00:59 -07:00 |
| ci-tests | list responses | 2025-05-23 13:00:59 -07:00 |
| dell | list responses | 2025-05-23 13:00:59 -07:00 |
| experimental-post-training | feat: add huggingface post_training impl (#2132) | 2025-05-16 14:41:28 -07:00 |
| fireworks | list responses | 2025-05-23 13:00:59 -07:00 |
| groq | list responses | 2025-05-23 13:00:59 -07:00 |
| hf-endpoint | list responses | 2025-05-23 13:00:59 -07:00 |
| hf-serverless | list responses | 2025-05-23 13:00:59 -07:00 |
| llama_api | list responses | 2025-05-23 13:00:59 -07:00 |
| meta-reference-gpu | list responses | 2025-05-23 13:00:59 -07:00 |
| nvidia | list responses | 2025-05-23 13:00:59 -07:00 |
| ollama | list responses | 2025-05-23 13:00:59 -07:00 |
| open-benchmark | list responses | 2025-05-23 13:00:59 -07:00 |
| passthrough | list responses | 2025-05-23 13:00:59 -07:00 |
| remote-vllm | list responses | 2025-05-23 13:00:59 -07:00 |
| sambanova | list responses | 2025-05-23 13:00:59 -07:00 |
| starter | list responses | 2025-05-23 13:00:59 -07:00 |
| tgi | list responses | 2025-05-23 13:00:59 -07:00 |
| together | list responses | 2025-05-23 13:00:59 -07:00 |
| verification | list responses | 2025-05-23 13:00:59 -07:00 |
| vllm-gpu | list responses | 2025-05-23 13:00:59 -07:00 |
| watsonx | list responses | 2025-05-23 13:00:59 -07:00 |
| __init__.py | Auto-generate distro yamls + docs (#468) | 2024-11-18 14:57:06 -08:00 |
| dependencies.json | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |
| template.py | feat: implement get chat completions APIs (#2200) | 2025-05-21 22:21:52 -07:00 |
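
Each subdirectory above is a distribution template. As a rough sketch of how these template names are typically consumed (assuming the standard `llama stack` CLI; exact flags can vary by release), a template such as `ollama` can be used to build and run a distribution:

```bash
# Build a distribution from one of the templates listed above.
# Assumes the llama-stack CLI is installed; flags may differ by version.
llama stack build --template ollama --image-type conda

# Start the server for the built distribution.
llama stack run ollama
```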