diff --git a/.github/workflows/publish-to-test-pypi.yml b/.github/workflows/publish-to-test-pypi.yml
index 9fe502254..2e8aaab23 100644
--- a/.github/workflows/publish-to-test-pypi.yml
+++ b/.github/workflows/publish-to-test-pypi.yml
@@ -238,7 +238,7 @@ jobs:
         run: |
           pip install pytest nbval
           llama stack build --template together --image-type venv
-          pytest -v -s --nbval-lax ./docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb
+          pytest -v -s --nbval-lax ./docs/getting_started.ipynb
           pytest -v -s --nbval-lax ./docs/notebooks/Llama_Stack_Benchmark_Evals.ipynb
 
 # TODO: add trigger for integration test workflow & docker builds
diff --git a/README.md b/README.md
index 9d6009bae..17acd0096 100644
--- a/README.md
+++ b/README.md
@@ -4,7 +4,7 @@
 [![PyPI - Downloads](https://img.shields.io/pypi/dm/llama-stack)](https://pypi.org/project/llama-stack/)
 [![Discord](https://img.shields.io/discord/1257833999603335178)](https://discord.gg/llama-stack)
 
-[**Quick Start**](https://llama-stack.readthedocs.io/en/latest/getting_started/index.html) | [**Documentation**](https://llama-stack.readthedocs.io/en/latest/index.html) | [**Colab Notebook**](./docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb)
+[**Quick Start**](https://llama-stack.readthedocs.io/en/latest/getting_started/index.html) | [**Documentation**](https://llama-stack.readthedocs.io/en/latest/index.html) | [**Colab Notebook**](./docs/getting_started.ipynb)
 
 Llama Stack defines and standardizes the core building blocks that simplify AI application development. It codified best practices across the Llama ecosystem. More specifically, it provides
 
@@ -97,7 +97,7 @@ Please checkout our [Documentation](https://llama-stack.readthedocs.io/en/latest
   * Guide using `llama` CLI to work with Llama models (download, study prompts), and building/starting a Llama Stack distribution.
 * [Getting Started](https://llama-stack.readthedocs.io/en/latest/getting_started/index.html)
   * Quick guide to start a Llama Stack server.
-  * [Jupyter notebook](./docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb) to walk-through how to use simple text and vision inference llama_stack_client APIs
+  * [Jupyter notebook](./docs/getting_started.ipynb) to walk-through how to use simple text and vision inference llama_stack_client APIs
   * The complete Llama Stack lesson [Colab notebook](https://colab.research.google.com/drive/1dtVmxotBsI4cGZQNsJRYPrLiDeT0Wnwt) of the new [Llama 3.2 course on Deeplearning.ai](https://learn.deeplearning.ai/courses/introducing-multimodal-llama-3-2/lesson/8/llama-stack).
   * A [Zero-to-Hero Guide](https://github.com/meta-llama/llama-stack/tree/main/docs/zero_to_hero_guide) that guide you through all the key components of llama stack with code samples.
 * [Contributing](CONTRIBUTING.md)
diff --git a/docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb b/docs/getting_started.ipynb
similarity index 100%
rename from docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb
rename to docs/getting_started.ipynb
diff --git a/docs/source/building_applications/telemetry.md b/docs/source/building_applications/telemetry.md
index ee640398b..45bc7a1c2 100644
--- a/docs/source/building_applications/telemetry.md
+++ b/docs/source/building_applications/telemetry.md
@@ -77,4 +77,4 @@ Once the Jaeger instance is running, you can visualize traces by navigating to h
 
 ## Querying Traces Stored in SQLIte
 
-The `sqlite` sink allows you to query traces without an external system. Here are some example queries. Refer to the notebook at [Llama Stack Building AI Applications](https://github.com/meta-llama/llama-stack/blob/main/docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb) for more examples on how to query traces and spaces.
+The `sqlite` sink allows you to query traces without an external system. Here are some example queries. Refer to the notebook at [Llama Stack Building AI Applications](https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.ipynb) for more examples on how to query traces and spaces.
diff --git a/docs/source/building_applications/tools.md b/docs/source/building_applications/tools.md
index 1339a14ae..81b4ab68e 100644
--- a/docs/source/building_applications/tools.md
+++ b/docs/source/building_applications/tools.md
@@ -7,7 +7,7 @@ Tools are treated as any other resource in llama stack like models. You can regi
 
 When instatiating an agent, you can provide it a list of tool groups that it has access to. Agent gets the corresponding tool definitions for the specified tool groups and passes them along to the model.
 
-Refer to the [Building AI Applications](https://github.com/meta-llama/llama-stack/blob/main/docs/notebooks/Llama_Stack_Building_AI_Applications.ipynb) notebook for more examples on how to use tools.
+Refer to the [Building AI Applications](https://github.com/meta-llama/llama-stack/blob/main/docs/getting_started.ipynb) notebook for more examples on how to use tools.
 
 ## Types of Tool Group providers
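The renamed notebook can be checked locally by reusing the same commands the CI step above runs. This is a minimal sketch, not part of the diff itself; it assumes a Python environment with `llama-stack` installed, and the notebook cells that call inference will additionally need provider credentials (e.g. a Together API key), which this diff does not cover.

```bash
# Sketch: reproduce the notebook CI check locally against the renamed path
# (commands copied from the workflow hunk above).
pip install pytest nbval
llama stack build --template together --image-type venv
pytest -v -s --nbval-lax ./docs/getting_started.ipynb
pytest -v -s --nbval-lax ./docs/notebooks/Llama_Stack_Benchmark_Evals.ipynb
```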