docs: remove pure venv references (#3047)

# What does this PR do?
Remove pure venv (without uv) references in docs
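
For reference, a minimal before/after sketch of the substitution applied throughout (the `starter` distro name and Python version are taken from the diff below):

```bash
# Old pattern being removed: plain venv
python -m venv myenv
source myenv/bin/activate  # On Windows: myenv\Scripts\activate

# New pattern: uv-managed venv
uv venv myenv --python 3.12
source myenv/bin/activate  # On Windows: myenv\Scripts\activate
```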

## Test Plan

Author: IAN MILLER (committed by GitHub)
Date: 2025-08-06 18:42:34 +01:00
Commit: 8ba04205ac (parent: e9fced773a)
4 changed files with 7 additions and 13 deletions

@@ -56,12 +56,12 @@ Breaking down the demo app, this section will show the core pieces that are used
 ### Setup Remote Inferencing
 Start a Llama Stack server on localhost. Here is an example of how you can do this using the firework.ai distribution:
 ```
-python -m venv stack-fireworks
-source stack-fireworks/bin/activate # On Windows: stack-fireworks\Scripts\activate
+uv venv starter --python 3.12
+source starter/bin/activate # On Windows: starter\Scripts\activate
 pip install --no-cache llama-stack==0.2.2
-llama stack build --distro fireworks --image-type venv
+llama stack build --distro starter --image-type venv
 export FIREWORKS_API_KEY=<SOME_KEY>
-llama stack run fireworks --port 5050
+llama stack run starter --port 5050
 ```
 Ensure the Llama Stack server version is the same as the Kotlin SDK Library for maximum compatibility.
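
Not shown in the diff, but as a hedged aside: one way to confirm which `llama-stack` version is installed in the active environment (the package name comes from the `pip install` line above):

```bash
# Print the installed llama-stack version so it can be compared against the Kotlin SDK
pip show llama-stack | grep -i '^version'
```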

@@ -150,13 +150,7 @@ pip install llama-stack-client
 ```
 :::
-:::{tab-item} Install with `venv`
-```bash
-python -m venv stack-client
-source stack-client/bin/activate # On Windows: stack-client\Scripts\activate
-pip install llama-stack-client
-```
-:::
 ::::
 Now let's use the `llama-stack-client` [CLI](../references/llama_stack_client_cli_reference.md) to check the
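
The hunk cuts off here; as a hedged illustration of pointing that CLI at a locally running server (the `configure` and `models list` subcommands and the endpoint are assumptions, not taken from this diff):

```bash
# Register the server endpoint with the client CLI (endpoint is a placeholder)
llama-stack-client configure --endpoint http://localhost:5050
# Ask the server which models it serves
llama-stack-client models list
```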

@@ -19,7 +19,7 @@ You have two ways to install Llama Stack:
 cd ~/local
 git clone git@github.com:meta-llama/llama-stack.git
-python -m venv myenv
+uv venv myenv --python 3.12
 source myenv/bin/activate # On Windows: myenv\Scripts\activate
 cd llama-stack

@@ -19,7 +19,7 @@ You have two ways to install Llama Stack:
 cd ~/local
 git clone git@github.com:meta-llama/llama-stack.git
-python -m venv myenv
+uv venv myenv --python 3.12
 source myenv/bin/activate # On Windows: myenv\Scripts\activate
 cd llama-stack
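
Putting the last two hunks together, a minimal sketch of the uv-based source checkout they now describe (the final editable install is an assumption, not shown in the diff):

```bash
cd ~/local
git clone git@github.com:meta-llama/llama-stack.git
uv venv myenv --python 3.12
source myenv/bin/activate  # On Windows: myenv\Scripts\activate
cd llama-stack
# Assumed follow-up step, not part of the hunks above: editable install into the uv venv
uv pip install -e .
```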