forked from phoenix-oss/llama-stack-mirror
docs: Correct typos in Zero to Hero guide (#997)
# What does this PR do?

Corrects some typographical errors found in the `docs/zero_to_hero_guide/README.md` file.

## Test Plan

N/A

Co-authored-by: Maxime Lecanu <mlecanu@fb.com>
This commit is contained in:
parent a84e7669f0
commit e964ec95e9

1 changed file with 3 additions and 3 deletions
````diff
@@ -45,7 +45,7 @@ If you're looking for more specific topics, we have a [Zero to Hero Guide](#next
 ---

-## Install Dependencies and Set Up Environmen
+## Install Dependencies and Set Up Environment

 1. **Create a Conda Environment**:
    Create a new Conda environment with Python 3.10:
````
````diff
@@ -110,7 +110,7 @@ If you're looking for more specific topics, we have a [Zero to Hero Guide](#next
    --env SAFETY_MODEL=$SAFETY_MODEL
    --env OLLAMA_URL=$OLLAMA_URL
    ```
-Note: Everytime you run a new model with `ollama run`, you will need to restart the llama stack. Otherwise it won't see the new model.
+Note: Every time you run a new model with `ollama run`, you will need to restart the llama stack. Otherwise it won't see the new model.

 The server will start and listen on `http://localhost:5001`.
````
````diff
@@ -191,7 +191,7 @@ You can check the available models with the command `llama-stack-client models l

 You can also interact with the Llama Stack server using a simple Python script. Below is an example:

-### 1. Activate Conda Environmen
+### 1. Activate Conda Environment

 ```bash
 conda activate ollama
````
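The hunk above touches the part of the guide that describes querying the Llama Stack server from a simple Python script. As a hedged illustration of that workflow (not code from this PR), the sketch below builds the base URL the guide says the server listens on (`http://localhost:5001`) and, only if a client package is installed, lists the available models. The `llama_stack_client` package name, `LlamaStackClient` class, and `models.list()` call are assumptions inferred from the guide's context; the import guard lets the script run even without the package or a live server.

```python
# Hypothetical sketch of talking to the guide's Llama Stack server.
# Assumptions: the server runs on localhost:5001 (as stated in the diff),
# and the client package/API names below match the real SDK.


def build_base_url(host: str = "localhost", port: int = 5001) -> str:
    """Return the base URL the guide's server listens on."""
    return f"http://{host}:{port}"


def main() -> None:
    try:
        # Assumed package and class names; adjust to the installed SDK.
        from llama_stack_client import LlamaStackClient
    except ImportError:
        print("llama-stack-client is not installed; skipping live call")
        return

    client = LlamaStackClient(base_url=build_base_url())
    # Assumed call shape, mirroring `llama-stack-client models list` in the guide.
    for model in client.models.list():
        print(model)


if __name__ == "__main__":
    main()
```

If the server was started with a different `--port`, pass it through `build_base_url(port=...)` so the script and the running stack agree.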