update readme

commit be9e488e56
parent 1e978e16b1
Xi Yan, 2024-09-10 11:35:27 -07:00

@@ -295,13 +295,13 @@ As you can see above, each “distribution” details the “providers” it is
Let's imagine you are working with an 8B-Instruct model. The following command will build a package (in the form of a Conda environment) _and_ configure it. As part of the configuration, you will be asked for some inputs (model_id, max_seq_len, etc.). Since we are working with an 8B model, we will name our build `8b-instruct` to help us remember the config.
```
-llama stack build local --name 8b-instruct
+llama stack build --distribution local --name 8b-instruct
```
Once it runs successfully, you should see output of the form:
```
-$ llama stack build local --name 8b-instruct
+$ llama stack build --distribution local --name 8b-instruct
....
....
Successfully installed cfgv-3.4.0 distlib-0.3.8 identify-2.6.0 libcst-1.4.0 llama_toolchain-0.0.2 moreorless-0.4.0 nodeenv-1.9.1 pre-commit-3.8.0 stdlibs-2024.5.15 toml-0.10.2 tomlkit-0.13.0 trailrunner-1.4.0 ufmt-2.7.0 usort-1.0.8 virtualenv-20.26.3
@@ -312,12 +312,35 @@ Successfully setup conda environment. Configuring build...
...
YAML configuration has been written to ~/.llama/builds/local/conda/8b-instruct.yaml
Target `8b-instruct` built with configuration at /home/xiyan/.llama/builds/local/conda/8b-instruct.yaml
Build spec configuration saved at /home/xiyan/.llama/distributions/local/conda/8b-instruct-build.yaml
```
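The `*-build.yaml` build spec records the inputs you provided during the build so that the same package can be recreated later. If you want to see exactly what was saved, you can view the file at the path reported above (assuming the example build name `8b-instruct` used in this walkthrough):
```
# Inspect the saved build spec for this example build
cat ~/.llama/distributions/local/conda/8b-instruct-build.yaml
```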
You can re-build the package from this saved build config:
```
$ llama stack build --config-file ~/.llama/distributions/local/conda/8b-instruct-build.yaml
Successfully setup conda environment. Configuring build...
...
...
YAML configuration has been written to ~/.llama/builds/local/conda/8b-instruct.yaml
Target `8b-instruct` built with configuration at /home/xiyan/.llama/builds/local/conda/8b-instruct.yaml
Build spec configuration saved at /home/xiyan/.llama/distributions/local/conda/8b-instruct-build.yaml
```
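At this point there are two YAML artifacts on disk: the run-time configuration under `~/.llama/builds/...` (the file `llama stack run` consumes below) and the build spec under `~/.llama/distributions/...` (the file `llama stack build --config-file` consumes). If you lose track of them, listing the two directories from the output above is a quick way to see which builds exist:
```
ls ~/.llama/builds/local/conda/          # run-time YAML configurations
ls ~/.llama/distributions/local/conda/   # *-build.yaml build specs
```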
### Step 3.3: Configure a distribution
You can re-configure this distribution by running:
```
-llama stack configure local --name 8b-instruct
+llama stack configure --config-file ~/.llama/distributions/local/conda/8b-instruct-build.yaml
```
or
```
llama stack configure --distribution local --name 8b-instruct
```
Here is an example run of how the CLI will guide you through filling in the configuration:
@@ -363,12 +386,12 @@ Now let's start the Llama Stack Distribution Server.
You need the YAML configuration file that was written out at the end of the `llama stack build` step.
```
-llama stack run local --name 8b-instruct --port 5000
+llama stack run --run-config ~/.llama/builds/local/conda/8b-instruct.yaml --port 5000
```
You should see the Stack server start and print the APIs that it supports:
```
-$ llama stack run local --name 8b-instruct --port 5000
+$ llama stack run --run-config ~/.llama/builds/local/conda/8b-instruct.yaml --port 5000
> initializing model parallel with size 1
> initializing ddp with size 1