llama-stack-mirror/llama_stack/core/ui
Ashwin Bharambe cc87995e2b
chore: rename templates to distributions (#3035)
As the title says. Distributions is in, Templates is out.

`llama stack build --template` --> `llama stack build --distro`. For
backward compatibility, the previous option is kept but results in a
warning.

Updated `server.py` to remove the "config_or_template" backward
compatibility since it has been a couple releases since that change.
2025-08-04 11:34:17 -07:00
modules chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
page chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
__init__.py chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
app.py chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
Containerfile chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00
README.md chore: rename templates to distributions (#3035) 2025-08-04 11:34:17 -07:00
requirements.txt chore(rename): move llama_stack.distribution to llama_stack.core (#2975) 2025-07-30 23:30:53 -07:00

(Experimental) Llama Stack UI

Docker Setup

⚠️ This is a work in progress.

Developer Setup

  1. Start up the Llama Stack API server. More details are available in the Llama Stack documentation.
llama stack build --distro together --image-type venv

llama stack run together
  2. (Optional) Register datasets and eval tasks as resources if you want to run pre-configured evaluation flows (e.g. the Evaluations (Generation + Scoring) page).
llama-stack-client datasets register \
--dataset-id "mmlu" \
--provider-id "huggingface" \
--url "https://huggingface.co/datasets/llamastack/evals" \
--metadata '{"path": "llamastack/evals", "name": "evals__mmlu__details", "split": "train"}' \
--schema '{"input_query": {"type": "string"}, "expected_answer": {"type": "string", "chat_completion_input": {"type": "string"}}}'
llama-stack-client benchmarks register \
--eval-task-id meta-reference-mmlu \
--provider-id meta-reference \
--dataset-id mmlu \
--scoring-functions basic::regex_parser_multiple_choice_answer
  3. Start the Streamlit UI
uv run --with ".[ui]" streamlit run llama_stack/core/ui/app.py
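
Before opening the UI, it can help to confirm that the server from step 1 is reachable. Below is a minimal connectivity-check sketch using the llama-stack-client Python SDK, assuming it is installed and the server is on the default endpoint (the same LLAMA_STACK_ENDPOINT variable listed in the next section); it is illustrative, not part of the UI itself.

import os

from llama_stack_client import LlamaStackClient

# Same endpoint the UI reads; override with LLAMA_STACK_ENDPOINT if the server
# is not on the default port.
endpoint = os.environ.get("LLAMA_STACK_ENDPOINT", "http://localhost:8321")
client = LlamaStackClient(base_url=endpoint)

# Listing models is a cheap way to confirm the distribution is up and serving.
for model in client.models.list():
    print(model.identifier)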

Environment Variables

Environment Variable | Description | Default Value
LLAMA_STACK_ENDPOINT | The endpoint for the Llama Stack server | http://localhost:8321
FIREWORKS_API_KEY | API key for the Fireworks provider | (empty string)
TOGETHER_API_KEY | API key for the Together provider | (empty string)
SAMBANOVA_API_KEY | API key for the SambaNova provider | (empty string)
OPENAI_API_KEY | API key for the OpenAI provider | (empty string)
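
These variables are read from the process environment when the UI starts. A short sketch of the fallback behavior (illustrative, not the exact code in app.py): each setting falls back to the default in the table above when unset.

import os

# Unset variables fall back to the defaults listed in the table above.
endpoint = os.environ.get("LLAMA_STACK_ENDPOINT", "http://localhost:8321")
together_api_key = os.environ.get("TOGETHER_API_KEY", "")

Export the variables in your shell before launching Streamlit so both the UI and any provider clients pick them up.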