remote vllm distro
This commit is contained in:
parent ba82021d4b
commit 02c66b49fc
13 changed files with 188 additions and 18 deletions
@@ -7,16 +7,22 @@ The `llamastack/distribution-ollama` distribution consists of the following prov
| **Provider(s)** | remote::ollama | meta-reference | remote::pgvector, remote::chroma | remote::ollama | meta-reference |
### Docker: Start a Distribution (Single Node GPU)
### Docker: Start the Distribution (Single Node regular Desktop machine)
> [!NOTE]
> This will start an Ollama server in CPU-only mode; please see the [Ollama documentation](https://github.com/ollama/ollama) for details on serving models on CPU.
```
$ cd distributions/ollama; docker compose up
```
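Once the containers are up, a quick way to confirm the Ollama server is reachable is to query its model list; a minimal sketch, assuming Ollama is published on its default port 11434 on localhost:

```
$ curl http://localhost:11434/api/tags
```

An empty `models` list in the response is still fine; it only means no models have been pulled yet.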
### Docker: Start a Distribution (Single Node with nvidia GPUs)
> [!NOTE]
> This assumes you have a GPU available and are starting an Ollama server with access to that GPU.
```
$ cd distributions/ollama/gpu
$ ls
compose.yaml run.yaml
$ docker compose up
$ cd distributions/ollama-gpu; docker compose up
```
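To confirm the container can actually see the GPU, one option is to run nvidia-smi inside it; a sketch, assuming the Compose service is named `ollama` (check compose.yaml for the real service name):

```
$ docker compose exec ollama nvidia-smi
```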
You will see output similar to the following ---
@@ -38,18 +44,6 @@ To kill the server
docker compose down
```
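If the compose file also defines named volumes that you want removed along with the containers (an assumption about this particular setup), `docker compose down` accepts a `-v` flag:

```
$ docker compose down -v
```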
### Docker: Start the Distribution (Single Node CPU)
> [!NOTE]
> This will start an Ollama server in CPU-only mode; please see the [Ollama documentation](https://github.com/ollama/ollama) for details on serving models on CPU.
```
$ cd distributions/ollama/cpu
$ ls
compose.yaml run.yaml
$ docker compose up
```
### Conda: ollama run + llama stack run
If you wish to spin up an Ollama server separately and connect to it from Llama Stack, you may use the following commands.
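The concrete commands fall outside this hunk; as a rough sketch of the flow (the model tag and config path below are placeholders, not taken from this diff):

```
# Terminal 1: serve a model with Ollama (model tag is a placeholder)
$ ollama run llama3.1:8b

# Terminal 2: start Llama Stack with the distribution's run.yaml,
# whose ollama provider URL should point at the server above
$ llama stack run ./run.yaml
```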
@@ -144,7 +144,11 @@ docker compose down
:::{tab-item} ollama
```
$ cd llama-stack/distributions/ollama/cpu && docker compose up
$ cd llama-stack/distributions/ollama && docker compose up
# OR
$ cd llama-stack/distributions/ollama-gpu && docker compose up
```
You will see output similar to the following ---