Mirror of https://github.com/meta-llama/llama-stack.git (synced 2026-01-02 09:30:01 +00:00)
docs: Updated documentation and configuration to make things easier for the unfamiliar
Signed-off-by: Francisco Javier Arceo <farceo@redhat.com>
parent 9b478f3756, commit 2847216efb
10 changed files with 69 additions and 32 deletions
@@ -6,6 +6,7 @@ Llama Stack {{ llama_stack_version }} is now available! See the {{ llama_stack_v
# Llama Stack
## What is Llama Stack?
Llama Stack defines and standardizes the core building blocks needed to bring generative AI applications to market. It provides a unified set of APIs with implementations from leading service providers, enabling seamless transitions between development and production environments. More specifically, it provides
@@ -22,6 +23,12 @@ Llama Stack defines and standardizes the core building blocks needed to bring ge
Our goal is to provide pre-packaged implementations (aka "distributions") that can be run in a variety of deployment environments. Llama Stack can assist you across the entire app development lifecycle: start iterating locally, on mobile, or on desktop, then seamlessly transition to on-prem or public cloud deployments. At every point in this transition, the same set of APIs and the same developer experience are available.
## How does Llama Stack work?
Llama Stack consists of a [server](./distributions/index.md) (with multiple pluggable API [providers](./providers/index.md)) and [client SDKs](#available-sdks) meant to be used in your applications. The server can be run in a variety of environments, including local (inline) development, on-premises, and cloud. The client SDKs are available for Python, Swift, Node, and Kotlin.
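As a minimal sketch, here is what talking to a running server looks like through the Python client SDK (`llama-stack-client`). The base URL, model ID, and exact inference method below are assumptions that depend on your server configuration and client version:

```python
# Minimal sketch using the Python client SDK (llama-stack-client).
# Assumes a Llama Stack server is already running locally; the base URL
# and model ID are placeholders -- adjust them to your deployment.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

# List the models exposed by the server's configured providers.
for model in client.models.list():
    print(model.identifier)

# Run a simple chat completion against one of those models.
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",  # placeholder model ID
    messages=[{"role": "user", "content": "Hello, Llama Stack!"}],
)
print(response.completion_message.content)
```

The same request could be made from any of the other SDKs (Swift, Node, Kotlin) against the same server, which is the point of the unified API surface.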
## Quick Links
- New to Llama Stack? Start with the [Introduction](introduction/index) to understand our motivation and vision.