Commit 18bac27d4e by Ihar Hrachyshka, 2025-03-27 17:13:22 -04:00
fix: Use CONDA_DEFAULT_ENV presence as a flag to use conda mode (#1555)
# What does this PR do?

This is the second attempt to switch to system packages by default. It now
includes a hack to detect an active conda environment, in which case the
conda image type is used.

Note: Conda will only be used when --image-name is unset *and*
CONDA_DEFAULT_ENV is set. This means that users without conda will
correctly fall back to using system packages when no --image-* arguments
are passed at all.
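
In other words, the selection boils down to something like the sketch below.
The function name and return values are illustrative only, not the actual
llama-stack implementation:

```
import os


def infer_image_type(image_type: str | None, image_name: str | None) -> str:
    """Illustrative sketch of the fallback described above; not the real code."""
    if image_type:
        # An explicit --image-type always wins.
        return image_type
    if image_name is None and os.environ.get("CONDA_DEFAULT_ENV"):
        # No --image-* arguments, but a conda environment is active:
        # treat that as a request for conda mode.
        return "conda"
    # No conda detected and nothing requested: assume system packages.
    # (Handling of a bare --image-name is omitted from this sketch.)
    return "system"
```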


## Test Plan

Uses virtualenv:

```
$ llama stack build --template ollama --image-type venv
$ llama stack run --image-type venv ~/.llama/distributions/ollama/ollama-run.yaml
[...]
Using virtual environment: /home/ec2-user/src/llama-stack/schedule/.local
[...]
```

Uses system packages (with the virtualenv from the previous step already initialized):

```
$ llama stack run ~/.llama/distributions/ollama/ollama-run.yaml
[...]
INFO     2025-03-27 20:46:22,882 llama_stack.cli.stack.run:142 server: No image type or image name provided. Assuming environment packages.
[...]
```

Attempt to run from environment packages without the necessary packages
installed:
```
$ python -m venv barebones
$ . ./barebones/bin/activate
$ pip install -e . # to install llama command
$ llama stack run ~/.llama/distributions/ollama/ollama-run.yaml
[...]
ModuleNotFoundError: No module named 'fastapi'
```

^ failed as expected because the environment doesn't have the necessary
packages installed.

Now install some packages in the new environment:

```
$ pip install fastapi opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp aiosqlite ollama openai datasets faiss-cpu mcp autoevals
$ llama stack run ~/.llama/distributions/ollama/ollama-run.yaml
[...]
Uvicorn running on http://['::', '0.0.0.0']:8321 (Press CTRL+C to quit)
```

Now check whether setting CONDA_DEFAULT_ENV changes the default behavior:

```
$ export CONDA_DEFAULT_ENV=base
$ llama stack run ~/.llama/distributions/ollama/ollama-run.yaml
[...]
Using conda environment: base
Conda environment base does not exist.
[...]
```

---------

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>