llama-stack-mirror/llama_stack/cli
Charlie Doern 025f615868
feat: add support for running in a venv (#1018)
# What does this PR do?

Add `--image-type` to `llama stack run`, which accepts `conda`, `container`, or `venv`. Also add `start_venv.sh`, which starts the stack using a venv.

resolves #1007
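
The `start_venv.sh` script itself is not shown on this page, so here is a minimal sketch of what such a launcher could look like, assuming it receives a venv path, a run config, and an optional port; the argument order, variable names, and defaults are assumptions rather than the PR's exact script. The final server invocation mirrors the `python -m llama_stack.distribution.server.server --yaml-config ... --port 8321` command visible in the test-plan output below.

```bash
#!/usr/bin/env bash
# Hypothetical sketch of a start_venv.sh launcher (not the PR's exact script).
set -euo pipefail

VENV_PATH="$1"        # path to an existing virtual environment (assumed argument)
YAML_CONFIG="$2"      # run configuration, e.g. ~/.llama/distributions/ollama/ollama-run.yaml
PORT="${3:-8321}"     # server port, defaulting to the stack's usual 8321

# Activate the venv, then launch the stack server with the given config.
source "$VENV_PATH/bin/activate"

set -x
python -m llama_stack.distribution.server.server \
  --yaml-config "$YAML_CONFIG" \
  --port "$PORT"
```
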

## Test Plan

running locally:

`llama stack build --template ollama --image-type venv`
`llama stack run --image-type venv ~/.llama/distributions/ollama/ollama-run.yaml`
...
```
llama stack run --image-type venv ~/.llama/distributions/ollama/ollama-run.yaml
Using run configuration: /Users/charliedoern/.llama/distributions/ollama/ollama-run.yaml
+ python -m llama_stack.distribution.server.server --yaml-config /Users/charliedoern/.llama/distributions/ollama/ollama-run.yaml --port 8321
Using config file: /Users/charliedoern/.llama/distributions/ollama/ollama-run.yaml
Run configuration:
apis:
- agents
- datasetio
...
```

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-02-12 11:13:04 -05:00
| Name | Last commit | Date |
|---|---|---|
| model | fix: show proper help text (#1065) | 2025-02-12 06:38:25 -08:00 |
| scripts | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| stack | feat: add support for running in a venv (#1018) | 2025-02-12 11:13:04 -05:00 |
| tests | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| download.py | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| llama.py | Add a verify-download command to llama CLI (#457) | 2024-11-14 11:47:51 -08:00 |
| subcommand.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| table.py | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| verify_download.py | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |