chore: clean up distro doc (#1804)

# What does this PR do?
- Hide the distro doc (Docker needs to be thoroughly tested).

[//]: # (If resolving an issue, uncomment and update the line below)
[//]: # (Closes #[issue-number])

## Test Plan
- docs

[//]: # (## Documentation)
Xi Yan 2025-03-27 12:12:14 -07:00 committed by GitHub
parent 81393afb35
commit b5c27f77ad
3 changed files with 0 additions and 31 deletions


@@ -27,8 +27,6 @@ If you have built a container image and want to deploy it in a Kubernetes cluster
 :hidden:
 importing_as_library
-building_distro
 configuration
-selection
 kubernetes_deployment
 ```


@@ -93,7 +93,6 @@ getting_started/index
 concepts/index
 providers/index
 distributions/index
-distributions/selection
 building_applications/index
 playground/index
 contributing/index


@@ -92,8 +92,6 @@ Interactive pages for users to play with and explore Llama Stack API capabilities
 ## Starting the Llama Stack Playground
-### Llama CLI
 To start the Llama Stack Playground, run the following commands:
 1. Start up the Llama Stack API server
@@ -109,29 +107,3 @@ cd llama_stack/distribution/ui
 pip install -r requirements.txt
 streamlit run app.py
 ```
-### Docker
-Playground can also be started in a docker image:
-```sh
-export LLAMA_STACK_URL=http://localhost:11434
-docker run \
-  --pull always \
-  -p 8501:8501 \
-  -e LLAMA_STACK_ENDPOINT=$LLAMA_STACK_URL \
-  quay.io/jland/llama-stack-playground
-```
-## Configurable Environment Variables
-## Environment Variables
-| Environment Variable | Description | Default Value |
-|----------------------------|------------------------------------|---------------------------|
-| LLAMA_STACK_ENDPOINT | The endpoint for the Llama Stack | http://localhost:8321 |
-| FIREWORKS_API_KEY | API key for Fireworks provider | (empty string) |
-| TOGETHER_API_KEY | API key for Together provider | (empty string) |
-| SAMBANOVA_API_KEY | API key for SambaNova provider | (empty string) |
-| OPENAI_API_KEY | API key for OpenAI provider | (empty string) |
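With the Docker instructions and environment-variable table removed, the path that remains documented is the local streamlit run kept in the context lines above. A minimal sketch of that flow, assuming the UI still reads `LLAMA_STACK_ENDPOINT` (default `http://localhost:8321`, per the removed table) and that you are inside a llama-stack checkout:

```sh
# Sketch: run the playground locally against a running Llama Stack server.
# Assumes the UI still honors LLAMA_STACK_ENDPOINT (default http://localhost:8321),
# as documented in the table removed by this PR.
export LLAMA_STACK_ENDPOINT=http://localhost:8321
cd llama_stack/distribution/ui
pip install -r requirements.txt
streamlit run app.py
```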