From a76dd8c464e9f4df184dadc17f1e9afbd9ab82f0 Mon Sep 17 00:00:00 2001
From: Francisco Javier Arceo
Date: Fri, 21 Feb 2025 16:30:47 -0500
Subject: [PATCH] Updated language to say inline instead of local

Signed-off-by: Francisco Javier Arceo
---
 docs/source/concepts/index.md  | 2 +-
 docs/source/providers/index.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/concepts/index.md b/docs/source/concepts/index.md
index df46e0134..27eb74f00 100644
--- a/docs/source/concepts/index.md
+++ b/docs/source/concepts/index.md
@@ -33,7 +33,7 @@ Providers come in two flavors:
 - **Remote**: the provider runs as a separate service external to the Llama Stack codebase. Llama Stack contains a small amount of adapter code.
 - **Inline**: the provider is fully specified and implemented within the Llama Stack codebase. It may be a simple wrapper around an existing library, or a full fledged implementation within Llama Stack.
 
-Most importantly, Llama Stack always strives to provide at least one fully "local" provider for each API so you can iterate on a fully featured environment locally.
+Most importantly, Llama Stack always strives to provide at least one fully inline provider for each API so you can iterate on a fully featured environment locally.
 
 ## Resources
 Some of these APIs are associated with a set of **Resources**. Here is the mapping of APIs to resources:
diff --git a/docs/source/providers/index.md b/docs/source/providers/index.md
index 4371ca623..cc654823e 100644
--- a/docs/source/providers/index.md
+++ b/docs/source/providers/index.md
@@ -9,7 +9,7 @@ Providers come in two flavors:
 - **Remote**: the provider runs as a separate service external to the Llama Stack codebase. Llama Stack contains a small amount of adapter code.
 - **Inline**: the provider is fully specified and implemented within the Llama Stack codebase. It may be a simple wrapper around an existing library, or a full fledged implementation within Llama Stack.
 
-Importantly, Llama Stack always strives to provide at least one fully "local" provider for each API so you can iterate on a fully featured environment locally.
+Importantly, Llama Stack always strives to provide at least one fully inline provider for each API so you can iterate on a fully featured environment locally.
 
 ## Agents
 Run multi-step agentic workflows with LLMs with tool usage, memory (RAG), etc.
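
The remote/inline distinction this patch clarifies shows up directly in a distribution's run configuration, where each API is bound to providers whose `provider_type` carries an `inline::` or `remote::` prefix. Below is a minimal sketch of such a config, assuming the standard Llama Stack `run.yaml` layout; the specific provider IDs and config values are illustrative, not prescribed by this patch:

```yaml
# Sketch of a run.yaml fragment binding the inference API to two providers:
# one inline (runs inside the Llama Stack process) and one remote (a thin
# adapter talking to an external service). Values here are illustrative.
providers:
  inference:
    - provider_id: meta-reference
      provider_type: inline::meta-reference   # fully implemented in the Llama Stack codebase
      config:
        model: Llama3.2-3B-Instruct            # assumed model name for the example
    - provider_id: ollama
      provider_type: remote::ollama            # external service; Llama Stack ships only adapter code
      config:
        url: http://localhost:11434            # assumed local Ollama endpoint
```

Because both flavors implement the same API surface, a distribution can swap an `inline::` provider for a `remote::` one (or vice versa) without changing client code, which is what makes the "at least one fully inline provider per API" goal useful for local iteration.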