From a264d0a93d2bebeea9a3e9dccb2c4de8bcdbcd33 Mon Sep 17 00:00:00 2001
From: Justin Lee
Date: Fri, 22 Nov 2024 09:15:42 -0800
Subject: [PATCH] fixed pre-commit issues

---
 docs/zero_to_hero_guide/.env.template | 2 +-
 docs/zero_to_hero_guide/quickstart.md | 6 +++---
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/zero_to_hero_guide/.env.template b/docs/zero_to_hero_guide/.env.template
index b9ffbbc04..e748ac0a2 100644
--- a/docs/zero_to_hero_guide/.env.template
+++ b/docs/zero_to_hero_guide/.env.template
@@ -1 +1 @@
-BRAVE_SEARCH_API_KEY=YOUR_BRAVE_SEARCH_API_KEY
\ No newline at end of file
+BRAVE_SEARCH_API_KEY=YOUR_BRAVE_SEARCH_API_KEY
diff --git a/docs/zero_to_hero_guide/quickstart.md b/docs/zero_to_hero_guide/quickstart.md
index 704db8454..3da72a8f6 100644
--- a/docs/zero_to_hero_guide/quickstart.md
+++ b/docs/zero_to_hero_guide/quickstart.md
@@ -1,6 +1,6 @@
 # Quickstart Guide
 
-Llama-Stack allows you to configure your distribution from various providers, allowing you to focus on going from zero to production super fast. 
+Llama-Stack allows you to configure your distribution from various providers, allowing you to focus on going from zero to production super fast.
 
 This guide will walk you through how to build a local distribution, using ollama as an inference provider.
 
@@ -14,7 +14,7 @@ We also have a set of notebooks walking you through how to use Llama-Stack APIs:
 - Safety API
 - Agentic API
 
-Below, we will learn how to get started with Ollama as an inference provider, please note the steps for configuring your provider will vary a little depending on the service. However, the user experience will remain universal-this is the power of Llama-Stack. 
+Below, we will learn how to get started with Ollama as an inference provider, please note the steps for configuring your provider will vary a little depending on the service. However, the user experience will remain universal-this is the power of Llama-Stack.
 
 Prototype locally using Ollama, deploy to the cloud with your favorite provider or own deployment. Use any API from any provider while focussing on development.
 
@@ -105,7 +105,7 @@ If you're looking for more specific topics like tool calling or agent setup, we
 After this step, you will see the console output:
 
 ```
-Build Successful! Next steps: 
+Build Successful! Next steps:
 1. Set the environment variables: LLAMASTACK_PORT, OLLAMA_URL, INFERENCE_MODEL, SAFETY_MODEL
 2. `llama stack run /Users/username/.llama/distributions/llamastack-ollama/ollama-run.yaml`
 ```