Reordered pip install and llama model download (#112)

The `llama` CLI is only available after the pip install step (as the notebook itself notes), so it makes sense to place the install step before the model download.
Karthi Keyan 2024-09-26 23:07:15 +05:30 committed by GitHub
parent 3c99f08267
commit 995a1a1d00

@@ -39,13 +39,7 @@
 "$ docker pull llamastack/llamastack-local-gpu\n",
 "```\n",
 "\n",
-"2. Download model \n",
-"```\n",
-"$ llama download --help \n",
-"$ llama download --source meta --model-id Llama3.2-11B-Vision-Instruct --meta-url <META_URL>\n",
-"```\n",
-"\n",
-"3. pip install the llama stack client package \n",
+"2. pip install the llama stack client package \n",
 "For this purpose, we will directly work with pre-built docker containers and use the python SDK\n",
 "```\n",
 "$ git clone https://github.com/meta-llama/llama-stack-apps.git\n",
@@ -57,7 +51,13 @@
 "$ pip install llama_stack llama_stack_client\n",
 "```\n",
 "This will install `llama_stack` and `llama_stack_client` packages. \n",
-"This will also enable you to use the `llama` cli. \n",
+"This will enable you to use the `llama` cli. \n",
+"\n",
+"3. Download model \n",
+"```\n",
+"$ llama download --help \n",
+"$ llama download --source meta --model-id Llama3.2-11B-Vision-Instruct --meta-url <META_URL>\n",
+"```\n",
 "\n",
 "4. Configure the Stack Server\n",
 "```\n",