From 9613c63ca67b58f3eac5158a0fff46ccc3523719 Mon Sep 17 00:00:00 2001
From: skamenan7
Date: Mon, 14 Jul 2025 11:22:15 -0400
Subject: [PATCH] docs: clarify run.yaml files are starting points for customization

- Add new documentation section on customizing run.yaml files
- Clarify that generated run.yaml files are templates, not production configs
- Add guidance on customization best practices and common scenarios
- Update existing documentation to reference customization guide
- Improve clarity around run.yaml file usage for better user experience

This enhancement makes it clearer for users that run.yaml files are meant to be customized for production use, improving the overall developer experience.
---
 docs/source/distributions/building_distro.md |  4 ++
 docs/source/distributions/configuration.md   |  4 ++
 .../distributions/customizing_run_yaml.md    | 40 +++++++++++++++++++
 docs/source/distributions/index.md           |  1 +
 .../getting_started/detailed_tutorial.md     |  2 +-
 5 files changed, 50 insertions(+), 1 deletion(-)
 create mode 100644 docs/source/distributions/customizing_run_yaml.md

diff --git a/docs/source/distributions/building_distro.md b/docs/source/distributions/building_distro.md
index f24974dd3..cd2c6b6a8 100644
--- a/docs/source/distributions/building_distro.md
+++ b/docs/source/distributions/building_distro.md
@@ -145,6 +145,10 @@ $ llama stack build --template starter
 ...
 You can now edit ~/.llama/distributions/llamastack-starter/starter-run.yaml and run `llama stack run ~/.llama/distributions/llamastack-starter/starter-run.yaml`
 ```
+
+```{tip}
+The generated `run.yaml` file is a starting point for your configuration. For comprehensive guidance on customizing it for your specific needs, infrastructure, and deployment scenarios, see [Customizing run.yaml Files](customizing_run_yaml.md).
+```
 :::
 
 :::{tab-item} Building from Scratch
diff --git a/docs/source/distributions/configuration.md b/docs/source/distributions/configuration.md
index 4709cb8c6..9548780c6 100644
--- a/docs/source/distributions/configuration.md
+++ b/docs/source/distributions/configuration.md
@@ -2,6 +2,10 @@
 
 The Llama Stack runtime configuration is specified as a YAML file. Here is a simplified version of an example configuration file for the Ollama distribution:
 
+```{note}
+The default `run.yaml` files generated by templates are starting points for your configuration. For guidance on customizing these files for your specific needs, see [Customizing run.yaml Files](customizing_run_yaml.md).
+```
+
 ```{dropdown} 👋 Click here for a Sample Configuration File
 
 ```yaml
diff --git a/docs/source/distributions/customizing_run_yaml.md b/docs/source/distributions/customizing_run_yaml.md
new file mode 100644
index 000000000..10067bab7
--- /dev/null
+++ b/docs/source/distributions/customizing_run_yaml.md
@@ -0,0 +1,40 @@
+# Customizing run.yaml Files
+
+The `run.yaml` files generated by Llama Stack templates are **starting points** designed to be customized for your specific needs. They are not meant to be used as-is in production environments.
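+
+For example, a generated template usually points providers at local development endpoints (such as `http://localhost:8000`) and keeps state under `/tmp/`. The sketch below is illustrative only (provider types, field names, and the exact environment-variable syntax depend on your template and Llama Stack version), but it shows the kind of edits a production deployment typically makes:
+
+```yaml
+providers:
+  inference:
+  - provider_id: vllm
+    provider_type: remote::vllm
+    config:
+      # Point at your own inference server instead of the localhost default,
+      # and pull the value from the environment rather than hard-coding it.
+      url: ${env.VLLM_URL}
+      api_token: ${env.VLLM_API_TOKEN}
+metadata_store:
+  type: sqlite
+  # Move storage out of /tmp/ for anything beyond local testing.
+  db_path: /var/lib/llama-stack/registry.db
+```
+
+Keeping host names and secrets in environment variables like this makes the same file easier to promote across dev, staging, and production.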
+
+## Key Points
+
+- **Templates are starting points**: Generated `run.yaml` files contain defaults for development and testing
+- **Customization expected**: Update URLs, credentials, models, and settings for your environment
+- **Version control separately**: Keep customized configs in your own repository
+- **Environment-specific**: Create different configurations for dev, staging, and production
+
+## What You Can Customize
+
+Common customizations include:
+- **Provider endpoints**: Change `http://localhost:8000` to your actual servers
+- **Providers**: Swap default providers for alternatives (e.g., Brave instead of Tavily for search)
+- **Storage paths**: Move from `/tmp/` to production directories
+- **Authentication**: Add API keys, SSL/TLS settings, and timeouts
+- **Models**: Use different model sizes for dev and prod
+- **Database settings**: Switch from SQLite to PostgreSQL
+- **Tool configurations**: Add custom tools and integrations
+
+## Best Practices
+
+- Use environment variables for secrets and environment-specific values
+- Create separate `run.yaml` files for different environments (dev, staging, prod)
+- Document your changes with comments
+- Test configurations before deployment
+- Keep your customized configs in version control
+
+Example structure:
+```
+your-project/
+├── configs/
+│   ├── dev-run.yaml
+│   └── prod-run.yaml
+└── README.md
+```
+
+The goal is to take the generated template and adapt it to your specific infrastructure and operational needs.
\ No newline at end of file
diff --git a/docs/source/distributions/index.md b/docs/source/distributions/index.md
index 103a6131f..600eec3a1 100644
--- a/docs/source/distributions/index.md
+++ b/docs/source/distributions/index.md
@@ -9,6 +9,7 @@ This section provides an overview of the distributions available in Llama Stack.
 
 importing_as_library
 configuration
+customizing_run_yaml
 list_of_distributions
 kubernetes_deployment
 building_distro
diff --git a/docs/source/getting_started/detailed_tutorial.md b/docs/source/getting_started/detailed_tutorial.md
index 35cb7f02e..97e7df774 100644
--- a/docs/source/getting_started/detailed_tutorial.md
+++ b/docs/source/getting_started/detailed_tutorial.md
@@ -54,7 +54,7 @@ Llama Stack is a server that exposes multiple APIs, you connect with it using th
 You can use Python to build and run the Llama Stack server, which is useful for testing and development.
 
 Llama Stack uses a [YAML configuration file](../distributions/configuration.md) to specify the stack setup,
-which defines the providers and their settings.
+which defines the providers and their settings. The generated configuration serves as a starting point that you can [customize for your specific needs](../distributions/customizing_run_yaml.md).
 
 Now let's build and run the Llama Stack config for Ollama.
 We use `starter` as template. By default all providers are disabled, this requires enable ollama by passing environment variables.