Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-16 19:59:26 +00:00
Update quickstart.md
parent 4deb95eaae
commit 06b699f5ec
1 changed file with 3 additions and 1 deletion
@@ -14,7 +14,9 @@ We also have a set of notebooks walking you through how to use Llama-Stack APIs:
 - Safety API
 - Agentic API

-Below, we will learn how to get started with Ollama as an inference provider, please note the steps for configuring your provider will vary a little depending on the service. However, the user experience will remain universal.
+Below, we will learn how to get started with Ollama as an inference provider, please note the steps for configuring your provider will vary a little depending on the service. However, the user experience will remain universal-this is the power of Llama-Stack.
+
+Prototype locally using Ollama, deploy to the cloud with your favorite provider or own deployment. Use any API from any provider while focussing on development.

 # Ollama Quickstart Guide
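The paragraphs added in this commit describe using Ollama as a local inference provider while keeping the client-side experience the same across providers. As an illustration only (not part of the commit), below is a minimal sketch of what that provider-agnostic client call can look like. It assumes the llama-stack-client Python package is installed, a Llama Stack server backed by Ollama is listening on localhost:5001, and a model is registered under the id shown; the port, model id, and exact method or field names are assumptions and may differ between versions.

```python
# Minimal sketch: querying a Llama Stack server that uses Ollama for inference.
# Assumptions (not from the commit): the `llama-stack-client` package is installed,
# a Llama Stack server configured with the Ollama provider is running on
# localhost:5001, and a Llama 3.2 instruct model is registered under the model id
# below. Names may vary between llama-stack-client versions.
from llama_stack_client import LlamaStackClient

# Point the client at the locally running Llama Stack server.
client = LlamaStackClient(base_url="http://localhost:5001")

# The same client call works whichever inference provider (Ollama, a cloud
# provider, or your own deployment) the server is configured with.
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",  # assumed model id
    messages=[{"role": "user", "content": "Write a haiku about Llama Stack."}],
)

# Print the assistant's reply from the completion message.
print(response.completion_message.content)
```

Swapping Ollama for a hosted provider would only change the server's configuration; the client call stays the same, which is the point the added paragraphs make.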