Commit graph

6 commits

Author SHA1 Message Date
Ashwin Bharambe
350d5449bd fix(ci): unset UV index env vars before distribution deps install
Explicitly unset UV_EXTRA_INDEX_URL and UV_INDEX_STRATEGY before installing
distribution dependencies so that those installs use only PyPI (not test.pypi).

This prevents UV from picking up residual index configuration from the
llama-stack installation step, which could otherwise cause it to look for
packages on test.pypi, where binary wheels may not be available.
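
A minimal sketch of the intent (the script structure and the requirements path
are assumptions, not the actual workflow contents):

```
# Hypothetical sketch: drop any leftover index configuration before the
# distribution-dependency install, so uv resolves strictly from PyPI.
unset UV_EXTRA_INDEX_URL UV_INDEX_STRATEGY
uv pip install --no-cache-dir -r /tmp/distribution-requirements.txt  # illustrative path
```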
2025-10-31 10:34:15 -07:00
Ashwin Bharambe
58fe7d7b59 fix: only set UV_INDEX_STRATEGY when UV_EXTRA_INDEX_URL is also set
When UV_EXTRA_INDEX_URL is empty (on the main branch), we shouldn't set
UV_INDEX_STRATEGY at all, not even to an empty value. UV treats an empty
UV_INDEX_STRATEGY env var as set but lacking a valid value, which causes errors.
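
A sketch of the guard described above (the concrete strategy value is an
assumption; the conditional export is the point):

```
# Only export UV_INDEX_STRATEGY when an extra index is actually configured;
# otherwise leave it completely unset so uv uses its default behavior.
if [ -n "${UV_EXTRA_INDEX_URL:-}" ]; then
  export UV_INDEX_STRATEGY=unsafe-best-match  # assumed value for release branches
fi
```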
2025-10-31 10:08:33 -07:00
Ashwin Bharambe
a51a4317b3 fix: scope UV env vars to llama-stack install only
The previous change set UV_EXTRA_INDEX_URL and UV_INDEX_STRATEGY as
persistent ENV vars, which affected all subsequent uv commands, including
distribution dependency installation. This caused conflicts when deps bring
their own --extra-index-url (such as PyTorch packages).

Now these env vars are only used inline for the llama-stack installation
step in editable mode, leaving distribution deps unaffected.
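
Roughly, the scoping looks like this (paths, package names, and the strategy
value are illustrative assumptions):

```
# Index configuration applies only to the llama-stack editable install.
UV_EXTRA_INDEX_URL="$UV_EXTRA_INDEX_URL" UV_INDEX_STRATEGY=unsafe-best-match \
  uv pip install --no-cache-dir -e /workspace/llama-stack

# Distribution deps keep their own index flags without interference,
# e.g. PyTorch wheels from their dedicated index.
uv pip install --no-cache-dir torch --extra-index-url https://download.pytorch.org/whl/cpu
```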
2025-10-31 10:05:39 -07:00
Ashwin Bharambe
98fa43fd94 fix: pass UV index config to docker build for RC dependencies
Docker builds on release branches need access to UV_EXTRA_INDEX_URL
and UV_INDEX_STRATEGY to resolve RC client dependencies from test.pypi.

Changes:
- Add UV_EXTRA_INDEX_URL and UV_INDEX_STRATEGY build args to Containerfile
- Pass these env vars as build args in integration-tests.sh
- ENV variables are now available during uv pip install in Docker builds
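
A sketch of the wiring described in the list above (exact file layout and
argument defaults are assumptions):

```
# Containerfile: accept the index configuration as build args and expose it
# to uv during the image build.
ARG UV_EXTRA_INDEX_URL=""
ARG UV_INDEX_STRATEGY=""
ENV UV_EXTRA_INDEX_URL=${UV_EXTRA_INDEX_URL} \
    UV_INDEX_STRATEGY=${UV_INDEX_STRATEGY}

# integration-tests.sh: forward the CI env vars into the build.
docker build . -f Containerfile \
  --build-arg UV_EXTRA_INDEX_URL="$UV_EXTRA_INDEX_URL" \
  --build-arg UV_INDEX_STRATEGY="$UV_INDEX_STRATEGY"
```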
2025-10-31 09:52:33 -07:00
ehhuang
ab2d5febb4
chore: install client first (#3862)
# What does this PR do?
Mirrors build_container.sh.

Trying to resolve:

0.105 + [ editable = editable ]
0.105 + [ ! -d /workspace/llama-stack ]
0.105 + uv pip install --no-cache-dir -e /workspace/llama-stack
0.261 Using Python 3.12.12 environment at: /usr/local
0.479   × No solution found when resolving dependencies:
0.479   ╰─▶ Because only llama-stack-client<=0.2.23 is available and
0.479       llama-stack==0.3.0rc4 depends on llama-stack-client>=0.3.0rc4, we can
0.479       conclude that llama-stack==0.3.0rc4 cannot be used.
0.479       And because only llama-stack==0.3.0rc4 is available and you require
0.479       llama-stack, we can conclude that your requirements are unsatisfiable.
------
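
A sketch of the ordering the title describes (command form is an assumption,
mirroring build_container.sh): install the matching RC client first, then the
editable llama-stack, so its llama-stack-client>=0.3.0rc4 pin can be satisfied
from the extra index.

```
# Hypothetical order of operations inside the container build:
uv pip install --no-cache-dir llama-stack-client   # resolve the RC client first
uv pip install --no-cache-dir -e /workspace/llama-stack
```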

## Test Plan
2025-10-20 14:56:45 -07:00
ehhuang
21772de5d3
chore: use dockerfile for building containers (#3839)
# What does this PR do?

relates to #2878 

We introduce a Containerfile that replaces the `llama stack
build` command (the command itself will be removed in a separate PR).

```
llama stack build --distro starter --image-type venv --run
```
is replaced by
```
llama stack list-deps starter | xargs -L1 uv pip install
llama stack run starter
```
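
For illustration, a minimal Containerfile built from the two commands above
(hypothetical only; the actual Containerfile in the repo is authoritative):

```
FROM python:3.12-slim

ARG DISTRO_NAME=starter
ENV DISTRO_NAME=${DISTRO_NAME}

# Install uv and llama-stack, then the distribution's dependencies.
RUN pip install --no-cache-dir uv llama-stack && \
    llama stack list-deps "${DISTRO_NAME}" | xargs -L1 uv pip install --system --no-cache-dir

EXPOSE 8321
CMD llama stack run "${DISTRO_NAME}"
```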


- See the updated workflow files for the e2e workflow.

## Test Plan
CI
```
❯ docker build . -f docker/Dockerfile --build-arg DISTRO_NAME=starter --build-arg INSTALL_MODE=editable --tag test_starter
❯ docker run -p 8321:8321 test_starter
❯ curl http://localhost:8321/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {
        "role": "user",
        "content": "Hello!"
      }
    ]
  }'
```

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed with
[ReviewStack](https://reviewstack.dev/llamastack/llama-stack/pull/3839).
* #3855
* __->__ #3839
2025-10-20 10:23:01 -07:00