
# (Experimental) Llama Stack UI

## Docker Setup

⚠️ This is a work in progress.

## Developer Setup

1. Start the Llama Stack API server. More details here.

   ```bash
   llama stack build --template together --image-type conda

   llama stack run together
   ```
2. (Optional) Register datasets and eval tasks as resources if you want to run pre-configured evaluation flows (e.g. the Evaluations (Generation + Scoring) page).

   ```bash
   llama-stack-client datasets register \
     --dataset-id "mmlu" \
     --provider-id "huggingface" \
     --url "https://huggingface.co/datasets/llamastack/evals" \
     --metadata '{"path": "llamastack/evals", "name": "evals__mmlu__details", "split": "train"}' \
     --schema '{"input_query": {"type": "string"}, "expected_answer": {"type": "string", "chat_completion_input": {"type": "string"}}}'
   ```

   ```bash
   llama-stack-client benchmarks register \
     --eval-task-id meta-reference-mmlu \
     --provider-id meta-reference \
     --dataset-id mmlu \
     --scoring-functions basic::regex_parser_multiple_choice_answer
   ```
3. Start the Streamlit UI.

   ```bash
   uv run --with ".[ui]" streamlit run llama_stack/distribution/ui/app.py
   ```
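Before launching the UI, it can help to confirm that the API server from step 1 is actually listening on the default endpoint. The sketch below is our own illustration (the `is_reachable` helper is not part of Llama Stack), using only the Python standard library:

```python
import urllib.error
import urllib.request


def is_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url`, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, just not with a 2xx status -- it is up.
        return True
    except OSError:
        # Connection refused, timeout, DNS failure, ...
        return False


if __name__ == "__main__":
    endpoint = "http://localhost:8321"  # the default LLAMA_STACK_ENDPOINT
    print(f"{endpoint} reachable: {is_reachable(endpoint)}")
```

If this prints `reachable: False`, re-check that `llama stack run` is still running before starting Streamlit.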

## Environment Variables

| Environment Variable | Description | Default Value |
|----------------------|-------------|---------------|
| `LLAMA_STACK_ENDPOINT` | The endpoint for the Llama Stack server | `http://localhost:8321` |
| `FIREWORKS_API_KEY` | API key for the Fireworks provider | (empty string) |
| `TOGETHER_API_KEY` | API key for the Together provider | (empty string) |
| `SAMBANOVA_API_KEY` | API key for the SambaNova provider | (empty string) |
| `OPENAI_API_KEY` | API key for the OpenAI provider | (empty string) |
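The lookup pattern these variables follow can be sketched in Python. This is a simplified illustration of "environment value wins, otherwise fall back to the default", not the UI's actual configuration code:

```python
import os

# Defaults mirror the table above; any value set in the environment wins.
DEFAULTS = {
    "LLAMA_STACK_ENDPOINT": "http://localhost:8321",
    "FIREWORKS_API_KEY": "",
    "TOGETHER_API_KEY": "",
    "SAMBANOVA_API_KEY": "",
    "OPENAI_API_KEY": "",
}

# Resolve each setting: use the environment variable if set, else the default.
config = {name: os.environ.get(name, default) for name, default in DEFAULTS.items()}

print(config["LLAMA_STACK_ENDPOINT"])
```

For example, running `LLAMA_STACK_ENDPOINT=http://remote-host:8321 streamlit run ...` would point the UI at a remote server instead of the local default.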