diff --git a/.github/workflows/README.md b/.github/workflows/README.md
index 059bb873f..7c9d2bffd 100644
--- a/.github/workflows/README.md
+++ b/.github/workflows/README.md
@@ -21,4 +21,3 @@ Llama Stack uses GitHub Actions for Continuous Integration (CI). Below is a tabl
 | Test External API and Providers | [test-external.yml](test-external.yml) | Test the External API and Provider mechanisms |
 | UI Tests | [ui-unit-tests.yml](ui-unit-tests.yml) | Run the UI test suite |
 | Unit Tests | [unit-tests.yml](unit-tests.yml) | Run the unit test suite |
-| Update ReadTheDocs | [update-readthedocs.yml](update-readthedocs.yml) | Update the Llama Stack ReadTheDocs site |
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 14690924d..da0ba5717 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -187,14 +187,16 @@ Note that the provider "description" field will be used to generate the provider
 ### Building the Documentation

-If you are making changes to the documentation at [https://llamastack.github.io/latest/](https://llamastack.github.io/latest/), you can use the following command to build the documentation and preview your changes. You will need [Sphinx](https://www.sphinx-doc.org/en/master/) and the readthedocs theme.
+If you are making changes to the documentation at [https://llamastack.github.io/](https://llamastack.github.io/), you can use the following commands to build the documentation and preview your changes.

 ```bash
-# This rebuilds the documentation pages.
-uv run --group docs make -C docs/ html
+# This rebuilds the documentation pages and the OpenAPI spec.
+npm install
+npm run gen-api-docs all
+npm run build

-# This will start a local server (usually at http://127.0.0.1:8000) that automatically rebuilds and refreshes when you make changes to the documentation.
-uv run --group docs sphinx-autobuild docs/source docs/build/html --write-all
+# This will start a local server (usually at http://127.0.0.1:3000).
+npm run serve
 ```

 ### Update API Documentation

@@ -205,4 +207,4 @@ If you modify or add new API endpoints, update the API documentation accordingly
 uv run ./docs/openapi_generator/run_openapi_generator.sh
 ```

-The generated API documentation will be available in `docs/_static/`. Make sure to review the changes before committing.
+The generated API schema will be available in `docs/static/`. Make sure to review the changes before committing.
diff --git a/docs/README.md b/docs/README.md
index 2e03dd80b..1847e49d8 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -1,14 +1,17 @@
 # Llama Stack Documentation

-Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our [Github page](https://llamastack.github.io/latest/getting_started/index.html).
+Here's a collection of comprehensive guides, examples, and resources for building AI applications with Llama Stack. For the complete documentation, visit our [GitHub page](https://llamastack.github.io/getting_started/quickstart).
 ## Render locally

-From the llama-stack root directory, run the following command to render the docs locally:
+From the llama-stack `docs/` directory, run the following commands to render the docs locally:

 ```bash
-uv run --group docs sphinx-autobuild docs/source docs/build/html --write-all
+npm install
+npm run gen-api-docs all
+npm run build
+npm run serve
 ```

-You can open up the docs in your browser at http://localhost:8000
+You can open the docs in your browser at http://localhost:3000

 ## Content

diff --git a/docs/docs/contributing/index.mdx b/docs/docs/contributing/index.mdx
index 8b3f86b03..7f50a058e 100644
--- a/docs/docs/contributing/index.mdx
+++ b/docs/docs/contributing/index.mdx
@@ -187,14 +187,16 @@ Note that the provider "description" field will be used to generate the provider
 ### Building the Documentation

-If you are making changes to the documentation at [https://llamastack.github.io/latest/](https://llamastack.github.io/latest/), you can use the following command to build the documentation and preview your changes. You will need [Sphinx](https://www.sphinx-doc.org/en/master/) and the readthedocs theme.
+If you are making changes to the documentation at [https://llamastack.github.io/](https://llamastack.github.io/), you can use the following commands to build the documentation and preview your changes.

 ```bash
-# This rebuilds the documentation pages.
-uv run --group docs make -C docs/ html
+# This rebuilds the documentation pages and the OpenAPI spec.
+npm install
+npm run gen-api-docs all
+npm run build

-# This will start a local server (usually at http://127.0.0.1:8000) that automatically rebuilds and refreshes when you make changes to the documentation.
-uv run --group docs sphinx-autobuild docs/source docs/build/html --write-all
+# This will start a local server (usually at http://127.0.0.1:3000).
+npm run serve
 ```

 ### Update API Documentation

@@ -205,7 +207,7 @@ If you modify or add new API endpoints, update the API documentation accordingly
 uv run ./docs/openapi_generator/run_openapi_generator.sh
 ```

-The generated API documentation will be available in `docs/_static/`. Make sure to review the changes before committing.
+The generated API schema will be available in `docs/static/`. Make sure to review the changes before committing.

 ## Adding a New Provider

diff --git a/docs/docs/index.mdx b/docs/docs/index.mdx
index 21e895d3f..bed931fe7 100644
--- a/docs/docs/index.mdx
+++ b/docs/docs/index.mdx
@@ -45,9 +45,9 @@ Llama Stack consists of a server (with multiple pluggable API providers) and Cli
 ## Quick Links

-- Ready to build? Check out the [Getting Started Guide](https://llama-stack.readthedocs.io/en/latest/getting_started/index.html) to get started.
-- Want to contribute? See the [Contributing Guide](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md).
-- Explore [Example Applications](https://github.com/meta-llama/llama-stack-apps) built with Llama Stack.
+- Ready to build? Check out the [Getting Started Guide](https://llamastack.github.io/getting_started/quickstart) to get started.
+- Want to contribute? See the [Contributing Guide](https://github.com/llamastack/llama-stack/blob/main/CONTRIBUTING.md).
+- Explore [Example Applications](https://github.com/llamastack/llama-stack-apps) built with Llama Stack.
 ## Rich Ecosystem Support

@@ -59,13 +59,13 @@ Llama Stack provides adapters for popular providers across all API categories:
 - **Training & Evaluation**: HuggingFace, TorchTune, NVIDIA NEMO

 :::info Provider Details
-For complete provider compatibility and setup instructions, see our [Providers Documentation](https://llama-stack.readthedocs.io/en/latest/providers/index.html).
+For complete provider compatibility and setup instructions, see our [Providers Documentation](https://llamastack.github.io/providers/).
 :::

 ## Get Started Today