diff --git a/docs/source/distributions/ondevice_distro/android_sdk.md b/docs/source/distributions/ondevice_distro/android_sdk.md
index 979acd913..9d16d07d7 100644
--- a/docs/source/distributions/ondevice_distro/android_sdk.md
+++ b/docs/source/distributions/ondevice_distro/android_sdk.md
@@ -56,12 +56,12 @@ Breaking down the demo app, this section will show the core pieces that are used
 
 ### Setup Remote Inferencing
 Start a Llama Stack server on localhost. Here is an example of how you can do this using the firework.ai distribution:
 ```
-python -m venv stack-fireworks
-source stack-fireworks/bin/activate # On Windows: stack-fireworks\Scripts\activate
+uv venv starter --python 3.12
+source starter/bin/activate # On Windows: starter\Scripts\activate
 pip install --no-cache llama-stack==0.2.2
-llama stack build --distro fireworks --image-type venv
+llama stack build --distro starter --image-type venv
 export FIREWORKS_API_KEY=
-llama stack run fireworks --port 5050
+llama stack run starter --port 5050
 ```
 Ensure the Llama Stack server version is the same as the Kotlin SDK Library for maximum compatibility.
diff --git a/docs/source/getting_started/detailed_tutorial.md b/docs/source/getting_started/detailed_tutorial.md
index ff2eaead4..14f888628 100644
--- a/docs/source/getting_started/detailed_tutorial.md
+++ b/docs/source/getting_started/detailed_tutorial.md
@@ -150,13 +150,7 @@
 pip install llama-stack-client
 ```
 :::
-:::{tab-item} Install with `venv`
-```bash
-python -m venv stack-client
-source stack-client/bin/activate # On Windows: stack-client\Scripts\activate
-pip install llama-stack-client
-```
-:::
+
 ::::
 
 Now let's use the `llama-stack-client` [CLI](../references/llama_stack_client_cli_reference.md) to check the
diff --git a/docs/source/references/llama_cli_reference/download_models.md b/docs/source/references/llama_cli_reference/download_models.md
index c44ba7788..e32099023 100644
--- a/docs/source/references/llama_cli_reference/download_models.md
+++ b/docs/source/references/llama_cli_reference/download_models.md
@@ -19,7 +19,7 @@ You have two ways to install Llama Stack:
 cd ~/local
 git clone git@github.com:meta-llama/llama-stack.git
 
-python -m venv myenv
+uv venv myenv --python 3.12
 source myenv/bin/activate # On Windows: myenv\Scripts\activate
 
 cd llama-stack
diff --git a/docs/source/references/llama_cli_reference/index.md b/docs/source/references/llama_cli_reference/index.md
index fc7751ebf..4ef76fe7d 100644
--- a/docs/source/references/llama_cli_reference/index.md
+++ b/docs/source/references/llama_cli_reference/index.md
@@ -19,7 +19,7 @@ You have two ways to install Llama Stack:
 cd ~/local
 git clone git@github.com:meta-llama/llama-stack.git
 
-python -m venv myenv
+uv venv myenv --python 3.12
 source myenv/bin/activate # On Windows: myenv\Scripts\activate
 
 cd llama-stack