forked from phoenix-oss/llama-stack-mirror
Reordered pip install and llama model download (#112)

The `llama` CLI only becomes available after the pip install step (as the notebook itself notes), so it makes sense to move that step first.
This commit is contained in:
parent 3c99f08267
commit 995a1a1d00

1 changed file with 8 additions and 8 deletions
@@ -39,13 +39,7 @@
 "$ docker pull llamastack/llamastack-local-gpu\n",
 "```\n",
 "\n",
-"2. Download model \n",
-"```\n",
-"$ llama download --help \n",
-"$ llama download --source meta --model-id Llama3.2-11B-Vision-Instruct --meta-url <META_URL>\n",
-"```\n",
-"\n",
-"3. pip install the llama stack client package \n",
+"2. pip install the llama stack client package \n",
 "For this purpose, we will directly work with pre-built docker containers and use the python SDK\n",
 "```\n",
 "$ git clone https://github.com/meta-llama/llama-stack-apps.git\n",
@@ -57,7 +51,13 @@
 "$ pip install llama_stack llama_stack_client\n",
 "```\n",
 "This will install `llama_stack` and `llama_stack_client` packages. \n",
-"This will also enable you to use the `llama` cli. \n",
+"This will enable you to use the `llama` cli. \n",
 "\n",
+"3. Download model \n",
+"```\n",
+"$ llama download --help \n",
+"$ llama download --source meta --model-id Llama3.2-11B-Vision-Instruct --meta-url <META_URL>\n",
+"```\n",
+"\n",
 "4. Configure the Stack Server\n",
 "```\n",
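With this commit applied, the notebook's setup commands run in the following order (a sketch assembled from the commands visible in the diff; `<META_URL>` is a placeholder to be replaced with the signed download URL Meta provides after license acceptance):

```shell
# 1. Pull the pre-built Llama Stack container image
docker pull llamastack/llamastack-local-gpu

# 2. Install the client packages; this also installs the `llama` CLI
pip install llama_stack llama_stack_client

# 3. Download the model -- possible only now that the `llama` CLI is installed
llama download --source meta --model-id Llama3.2-11B-Vision-Instruct --meta-url <META_URL>
```

This ordering is the whole point of the commit: step 3 depends on the CLI that step 2 installs, so putting the download before the install (as the notebook previously did) could not work as written.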