mirror of https://github.com/meta-llama/llama-stack.git
synced 2025-06-28 10:54:19 +00:00
Fix links & format (#104)
Fix broken examples link to llama-stack-apps repo
Remove extra space in README.md
This commit is contained in:
parent 615ed4bfbc
commit 37be3fb184
3 changed files with 3 additions and 3 deletions
@@ -5,7 +5,7 @@
 This repository contains the Llama Stack API specifications as well as API Providers and Llama Stack Distributions.

-The Llama Stack defines and standardizes the building blocks needed to bring generative AI applications to market. These blocks span the entire development lifecycle: from model training and fine-tuning, through product evaluation, to building and running AI agents in production. Beyond definition, we are building providers for the Llama Stack APIs. These we're developing open-source versions and partnering with providers , ensuring developers can assemble AI solutions using consistent, interlocking pieces across platforms. The ultimate goal is to accelerate innovation in the AI space.
+The Llama Stack defines and standardizes the building blocks needed to bring generative AI applications to market. These blocks span the entire development lifecycle: from model training and fine-tuning, through product evaluation, to building and running AI agents in production. Beyond definition, we are building providers for the Llama Stack APIs. These we're developing open-source versions and partnering with providers, ensuring developers can assemble AI solutions using consistent, interlocking pieces across platforms. The ultimate goal is to accelerate innovation in the AI space.

 The Stack APIs are rapidly improving, but still very much work in progress and we invite feedback as well as direct contributions.
@@ -483,4 +483,4 @@ Similarly you can test safety (if you configured llama-guard and/or prompt-guard
 python -m llama_stack.apis.safety.client localhost 5000
 ```

-You can find more example scripts with client SDKs to talk with the Llama Stack server in our [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps/tree/main/sdk_examples) repo.
+You can find more example scripts with client SDKs to talk with the Llama Stack server in our [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps/tree/main/examples) repo.
@@ -433,4 +433,4 @@ Similarly you can test safety (if you configured llama-guard and/or prompt-guard
 python -m llama_stack.apis.safety.client localhost 5000
 ```

-You can find more example scripts with client SDKs to talk with the Llama Stack server in our [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps) repo.
+You can find more example scripts with client SDKs to talk with the Llama Stack server in our [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps/tree/main/examples) repo.
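For reference, the `python -m llama_stack.apis.safety.client localhost 5000` command that appears as context in both hunks exercises the safety API of a Llama Stack server running on localhost:5000. A rough HTTP-level sketch of that interaction is shown below; the `/safety/run_shield` path, the `shield_type` value, and the payload shape are assumptions made for illustration, not the documented API contract.

```python
# Minimal sketch: send one message to a running Llama Stack server's safety API
# and print the shield's verdict. Endpoint path and payload fields are assumed,
# not taken from the official API spec.
import requests

BASE_URL = "http://localhost:5000"  # host and port from the command in the diff

payload = {
    "messages": [{"role": "user", "content": "Tell me how to pick a lock."}],
    "shield_type": "llama_guard",  # assumed identifier for a configured shield
}

resp = requests.post(f"{BASE_URL}/safety/run_shield", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # expected to describe any safety violation the shield flagged
```

For working end-to-end examples, the llama-stack-apps examples directory linked above is the authoritative reference for how the client SDKs talk to the server.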