From 3672e120ff676bad0bab3376eb008f4bc9f849b9 Mon Sep 17 00:00:00 2001
From: Ihar Hrachyshka
Date: Tue, 4 Feb 2025 18:25:01 -0500
Subject: [PATCH] [docs] Fix misc typos and formatting issues in intro docs

Signed-off-by: Ihar Hrachyshka
---
 docs/source/distributions/selection.md                 | 2 +-
 docs/source/distributions/self_hosted_distro/ollama.md | 4 +++-
 docs/source/getting_started/index.md                   | 2 +-
 3 files changed, 5 insertions(+), 3 deletions(-)

diff --git a/docs/source/distributions/selection.md b/docs/source/distributions/selection.md
index aaaf246ee..94ad79d02 100644
--- a/docs/source/distributions/selection.md
+++ b/docs/source/distributions/selection.md
@@ -23,7 +23,7 @@ Which templates / distributions to choose depends on the hardware you have for r
 - {dockerhub}`distribution-together` ([Guide](self_hosted_distro/together))
 - {dockerhub}`distribution-fireworks` ([Guide](self_hosted_distro/fireworks))
 
-- **Do you want to run Llama Stack inference on your iOS / Android device** Lastly, we also provide templates for running Llama Stack inference on your iOS / Android device:
+- **Do you want to run Llama Stack inference on your iOS / Android device?** Lastly, we also provide templates for running Llama Stack inference on your iOS / Android device:
   - [iOS SDK](ondevice_distro/ios_sdk)
   - [Android](ondevice_distro/android_sdk)
 
diff --git a/docs/source/distributions/self_hosted_distro/ollama.md b/docs/source/distributions/self_hosted_distro/ollama.md
index 92e1f7dbf..e7c729501 100644
--- a/docs/source/distributions/self_hosted_distro/ollama.md
+++ b/docs/source/distributions/self_hosted_distro/ollama.md
@@ -25,7 +25,9 @@ The `llamastack/distribution-ollama` distribution consists of the following prov
 | vector_io | `inline::faiss`, `remote::chromadb`, `remote::pgvector` |
 
 
-You should use this distribution if you have a regular desktop machine without very powerful GPUs. Of course, if you have powerful GPUs, you can still continue using this distribution since Ollama supports GPU acceleration.### Environment Variables
+You should use this distribution if you have a regular desktop machine without very powerful GPUs. Of course, if you have powerful GPUs, you can still continue using this distribution since Ollama supports GPU acceleration.
+
+### Environment Variables
 
 The following environment variables can be configured:
 
diff --git a/docs/source/getting_started/index.md b/docs/source/getting_started/index.md
index 00301d6e7..a2a38e6b4 100644
--- a/docs/source/getting_started/index.md
+++ b/docs/source/getting_started/index.md
@@ -1,6 +1,6 @@
 # Quick Start
 
-In this guide, we'll walk through how you can use the Llama Stack (server and client SDK ) to test a simple RAG agent.
+In this guide, we'll walk through how you can use the Llama Stack (server and client SDK) to test a simple RAG agent.
 
 A Llama Stack agent is a simple integrated system that can perform tasks by combining a Llama model for reasoning with tools (e.g., RAG, web search, code execution, etc.) for taking actions.