llama-stack-mirror/llama_stack/distribution
Rohan Awhad d797f9aec1
fix: #2495 FileNotFound Err in container image (#2498)
# What does this PR do?

Closes #2495 

Changes:
- Delay the `COPY run.yaml` step in the container image build until after
  external provider handling (see the sketch below)
- Split the check on `external_providers_dir` into two checks: "non-empty" and
  "directory exists"
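
For context, a minimal sketch of the ordering this PR establishes in the container build script. The helper name `add_to_container` and the exact paths are illustrative assumptions, not the actual `build_container.sh` code:

```bash
# Illustrative sketch only -- not the actual build_container.sh implementation.

# 1. Handle external providers first, with the check split in two:
#    the setting must be non-empty AND the directory must actually exist.
if [ -n "$external_providers_dir" ] && [ -d "$external_providers_dir" ]; then
  # add_to_container is a hypothetical helper that appends a line to the Containerfile
  add_to_container "COPY providers.d /.llama/providers.d"
fi

# 2. The run config is copied into the image only after the external-provider step.
add_to_container "COPY run.yaml /app/run.yaml"
```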


## Test Plan

0. Create and activate a venv

1. Create a `simple_build.yaml`
    ```yaml
    version: '2'
    distribution_spec:
      providers:
        inference:
          - remote::openai
    image_type: container
    image_name: openai-stack
    ```

2. Run llama stack build:
    ```bash
    llama stack build --config simple_build.yaml
    ```

3. Run the docker container:
    ```bash
    docker run \
      -p 8321:8321 \
      -e OPENAI_API_KEY=$OPENAI_API_KEY \
      openai_stack:0.2.12
    ```

This should show that the server is running:
```
INFO     2025-06-23 19:07:57,832 llama_stack.distribution.distribution:151 core: Loading external providers from /.llama/providers.d
INFO     2025-06-23 19:07:59,324 __main__:572 server: Listening on ['::', '0.0.0.0']:8321
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO     2025-06-23 19:07:59,336 __main__:156 server: Starting up
INFO:     Application startup complete.                                                                             
INFO:     Uvicorn running on http://['::', '0.0.0.0']:8321 (Press CTRL+C to quit)
```

Notice the first line:
```
Loading external providers from /.llama/providers.d
```
This is expected behaviour.
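
A quick way to confirm the server actually answers requests from the host. The `/v1/health` path is an assumption based on recent llama-stack releases; adjust it if your version exposes a different health route:

```bash
# Probe the mapped port; -f makes curl fail on non-2xx responses.
curl -sf http://localhost:8321/v1/health && echo "server is up"
```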

Co-authored-by: Rohan Awhad <rawhad@redhat.com>
2025-06-24 09:08:08 +05:30
| Name | Last commit | Date |
|---|---|---|
| `access_control` | feat: drop python 3.10 support (#2469) | 2025-06-19 12:07:14 +05:30 |
| `routers` | feat: support auth attributes in inference/responses stores (#2389) | 2025-06-20 10:24:45 -07:00 |
| `routing_tables` | feat: fine grained access control policy (#2264) | 2025-06-03 14:51:12 -07:00 |
| `server` | feat: drop python 3.10 support (#2469) | 2025-06-19 12:07:14 +05:30 |
| `store` | fix(tools): do not index tools, only index toolgroups (#2261) | 2025-05-25 13:27:52 -07:00 |
| `ui` | ci: add python package build test (#2457) | 2025-06-19 18:57:32 +05:30 |
| `utils` | refactor: remove container from list of run image types (#2178) | 2025-06-02 09:57:55 +02:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `build.py` | feat: drop python 3.10 support (#2469) | 2025-06-19 12:07:14 +05:30 |
| `build_conda_env.sh` | feat: drop python 3.10 support (#2469) | 2025-06-19 12:07:14 +05:30 |
| `build_container.sh` | fix: #2495 FileNotFound Err in container image (#2498) | 2025-06-24 09:08:08 +05:30 |
| `build_venv.sh` | chore: remove straggler references to llama-models (#1345) | 2025-03-01 14:26:03 -08:00 |
| `client.py` | chore: make cprint write to stderr (#2250) | 2025-05-24 23:39:57 -07:00 |
| `common.sh` | feat(pre-commit): enhance pre-commit hooks with additional checks (#2014) | 2025-04-30 11:35:49 -07:00 |
| `configure.py` | feat: refactor external providers dir (#2049) | 2025-05-15 20:17:03 +02:00 |
| `datatypes.py` | feat: fine grained access control policy (#2264) | 2025-06-03 14:51:12 -07:00 |
| `distribution.py` | ci: fix external provider test (#2438) | 2025-06-12 16:14:32 +02:00 |
| `inspect.py` | chore: use starlette built-in Route class (#2267) | 2025-05-28 09:53:33 -07:00 |
| `library_client.py` | refactor: unify stream and non-stream impls for responses (#2388) | 2025-06-05 17:48:09 +02:00 |
| `providers.py` | feat: drop python 3.10 support (#2469) | 2025-06-19 12:07:14 +05:30 |
| `request_headers.py` | feat: fine grained access control policy (#2264) | 2025-06-03 14:51:12 -07:00 |
| `resolver.py` | feat: support auth attributes in inference/responses stores (#2389) | 2025-06-20 10:24:45 -07:00 |
| `stack.py` | feat: fine grained access control policy (#2264) | 2025-06-03 14:51:12 -07:00 |
| `start_stack.sh` | refactor: remove container from list of run image types (#2178) | 2025-06-02 09:57:55 +02:00 |