docs: remove pure venv references

Author: r3v5
Date:   2025-08-06 11:15:47 +01:00
Parent: 7f834339ba
Commit: 4b4a532359
4 changed files with 7 additions and 13 deletions

@@ -56,12 +56,12 @@ Breaking down the demo app, this section will show the core pieces that are used
 ### Setup Remote Inferencing
 Start a Llama Stack server on localhost. Here is an example of how you can do this using the firework.ai distribution:
 ```
-python -m venv stack-fireworks
-source stack-fireworks/bin/activate # On Windows: stack-fireworks\Scripts\activate
+uv venv starter --python 3.12
+source starter/bin/activate # On Windows: starter\Scripts\activate
 pip install --no-cache llama-stack==0.2.2
-llama stack build --distro fireworks --image-type venv
+llama stack build --distro starter --image-type venv
 export FIREWORKS_API_KEY=<SOME_KEY>
-llama stack run fireworks --port 5050
+llama stack run starter --port 5050
 ```
 Ensure the Llama Stack server version is the same as the Kotlin SDK Library for maximum compatibility.
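Once the server from this hunk is running, a quick reachability check from the host (before wiring up the Kotlin demo app) can be done with the `llama-stack-client` CLI. This is a hedged sketch rather than part of the commit; it assumes the client CLI is installed and reuses the `--port 5050` value from the commands above.

```bash
# Point the client CLI at the locally running starter distribution
# (port taken from `llama stack run starter --port 5050` above).
llama-stack-client configure --endpoint http://localhost:5050

# List the models served by the stack to confirm the server is reachable.
llama-stack-client models list
```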

@@ -150,13 +150,7 @@ pip install llama-stack-client
 ```
 :::
-:::{tab-item} Install with `venv`
-```bash
-python -m venv stack-client
-source stack-client/bin/activate # On Windows: stack-client\Scripts\activate
-pip install llama-stack-client
-```
-:::
 ::::
 Now let's use the `llama-stack-client` [CLI](../references/llama_stack_client_cli_reference.md) to check the
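The deleted tab only repeated the `pip install` step inside a manually created `venv`. For readers who still want an isolated environment, a `uv`-based equivalent consistent with the rest of this commit is sketched below; the environment name `client-env` is illustrative, not taken from the docs.

```bash
# Create and activate an isolated environment with uv, then install the client package.
uv venv client-env --python 3.12
source client-env/bin/activate  # On Windows: client-env\Scripts\activate
pip install llama-stack-client
```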

@@ -19,7 +19,7 @@ You have two ways to install Llama Stack:
 cd ~/local
 git clone git@github.com:meta-llama/llama-stack.git
-python -m venv myenv
+uv venv myenv --python 3.12
 source myenv/bin/activate # On Windows: myenv\Scripts\activate
 cd llama-stack

@@ -19,7 +19,7 @@ You have two ways to install Llama Stack:
 cd ~/local
 git clone git@github.com:meta-llama/llama-stack.git
-python -m venv myenv
+uv venv myenv --python 3.12
 source myenv/bin/activate # On Windows: myenv\Scripts\activate
 cd llama-stack
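For context on where these two identical hunks lead, the cloned checkout is normally installed in editable mode once the environment is active. The step below is an assumption based on the usual install-from-source flow and is not part of this diff.

```bash
# With myenv activated and the llama-stack checkout as the working directory
# (per the context lines above), install it from source in editable mode.
pip install -e .
```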