Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-07-26 06:07:43 +00:00
prototype: use pyproject and uv to build distribution
Goals:

* remove the need for a custom tool to install a collection of Python packages, AKA `llama stack build`
* use the power of 'uv', which was designed to manage dependencies
* `llama stack build` can "probably" go away and be replaced with uv

How-to: with the pyproject, you can install an Ollama distribution in a virtual env like so:

```
uv venv --python 3.10 ollama-distro
source ollama-distro/bin/activate
uv sync --extra ollama
llama stack run llama_stack/templates/ollama/run.yaml
```

Caveats:

* external providers: we could still use a build file, or add the known external providers to the pyproject?
* growth of the uv.lock?

We also create a requirements.txt for convenience, as some users are more familiar with that format than with the pyproject.

Signed-off-by: Sébastien Han <seb@redhat.com>
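For context, a minimal sketch of how a distribution could be expressed as an optional-dependency group in pyproject.toml so that `uv sync --extra ollama` resolves it. The package names and pins below are illustrative assumptions, not the actual llama-stack dependency set:

```toml
# Hypothetical pyproject.toml excerpt -- dependency names are placeholders,
# not the real llama-stack requirements.
[project]
name = "llama_stack"
requires-python = ">=3.10"
dependencies = [
    "fastapi",   # core dependencies shared by every distribution
    "pydantic",
]

[project.optional-dependencies]
# Each distribution becomes an extra; `uv sync --extra ollama` installs it
# and records the resolution in uv.lock.
ollama = [
    "ollama",
    "aiohttp",
]
```

Under a scheme like this, the requirements.txt mentioned in the message could presumably be regenerated from the same resolution (uv can export its lock file to requirements format), keeping the two in sync.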
This commit is contained in:
parent 6832e8a658
commit b6ebbe1bc0
13 changed files with 5579 additions and 679 deletions
6 .github/workflows/integration-tests.yml vendored
@@ -41,16 +41,12 @@ jobs:
       - name: Setup ollama
         uses: ./.github/actions/setup-ollama

-      - name: Build Llama Stack
-        run: |
-          uv run llama stack build --template ollama --image-type venv
-
       - name: Start Llama Stack server in background
         if: matrix.client-type == 'http'
         env:
           INFERENCE_MODEL: "meta-llama/Llama-3.2-3B-Instruct"
         run: |
-          LLAMA_STACK_LOG_FILE=server.log nohup uv run llama stack run ./llama_stack/templates/ollama/run.yaml --image-type venv --env OLLAMA_URL="http://0.0.0.0:11434" &
+          LLAMA_STACK_LOG_FILE=server.log nohup uv run llama stack run ./llama_stack/templates/ollama/run.yaml --env OLLAMA_URL="http://0.0.0.0:11434" &

       - name: Wait for Llama Stack server to be ready
         if: matrix.client-type == 'http'
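If the build step really does go away, the workflow could install the distribution's dependencies straight from the pyproject instead. A rough sketch, assuming the `ollama` extra described in the commit message; the step name and placement are illustrative and not part of this diff:

```yaml
# Hypothetical replacement for the removed "Build Llama Stack" step --
# relies on an `ollama` extra being defined in pyproject.toml.
      - name: Install Ollama distribution dependencies
        run: |
          uv sync --extra ollama
```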