# What does this PR do?

Mirrors `build_container.sh`, trying to resolve:

```
0.105 + [ editable = editable ]
0.105 + [ ! -d /workspace/llama-stack ]
0.105 + uv pip install --no-cache-dir -e /workspace/llama-stack
0.261 Using Python 3.12.12 environment at: /usr/local
0.479   × No solution found when resolving dependencies:
0.479   ╰─▶ Because only llama-stack-client<=0.2.23 is available and
0.479       llama-stack==0.3.0rc4 depends on llama-stack-client>=0.3.0rc4, we can
0.479       conclude that llama-stack==0.3.0rc4 cannot be used.
0.479       And because only llama-stack==0.3.0rc4 is available and you require
0.479       llama-stack, we can conclude that your requirements are unsatisfiable.
```
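For context, a minimal sketch of the kind of install ordering `build_container.sh` uses and that this change mirrors: install the client from a local checkout before the editable `llama-stack` install, so the resolver does not have to find a published `llama-stack-client` release satisfying `>=0.3.0rc4`. The paths and variable defaults below are illustrative assumptions, not code copied from this PR or from `build_container.sh`.

```bash
#!/bin/bash
# Sketch only: mirrors the shape of the editable-install logic in
# build_container.sh; the paths and variable defaults are assumptions.
set -euo pipefail

LLAMA_STACK_DIR="${LLAMA_STACK_DIR:-/workspace/llama-stack}"
LLAMA_STACK_CLIENT_DIR="${LLAMA_STACK_CLIENT_DIR:-/workspace/llama-stack-client-python}"

# Install the client from its local checkout first (when present) so the
# editable llama-stack install below does not have to resolve
# llama-stack-client>=0.3.0rc4 against the releases published on PyPI.
if [ -d "$LLAMA_STACK_CLIENT_DIR" ]; then
  uv pip install --no-cache-dir -e "$LLAMA_STACK_CLIENT_DIR"
fi

# The step that previously failed dependency resolution.
[ -d "$LLAMA_STACK_DIR" ] || { echo "missing $LLAMA_STACK_DIR" >&2; exit 1; }
uv pip install --no-cache-dir -e "$LLAMA_STACK_DIR"
```

The idea is that an already-installed, new-enough client satisfies the constraint locally instead of forcing a lookup against the versions published on PyPI.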
## Test Plan

Files changed: `Containerfile`
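As a hypothetical verification, assuming the image is built directly from the repository's `Containerfile` (the tag below is a placeholder, and any build args the Containerfile may expect are omitted):

```bash
# Rebuild the image and confirm the editable
# "uv pip install -e /workspace/llama-stack" step now resolves successfully.
docker build -f Containerfile -t llama-stack:dev .
```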