From 2d9c8b549e0f0632c4dfa73a80db96697f34abb8 Mon Sep 17 00:00:00 2001
From: Ryan Cook
Date: Wed, 5 Feb 2025 17:06:39 -0500
Subject: [PATCH] docs: missing T in import (#974)

# What does this PR do?

Missing T in import

## Test Plan

N/A doc update

## Sources

Please link relevant resources if necessary.

## Before submitting

- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md), Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
---
 docs/zero_to_hero_guide/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/zero_to_hero_guide/README.md b/docs/zero_to_hero_guide/README.md
index d4462febf..4a09a7e62 100644
--- a/docs/zero_to_hero_guide/README.md
+++ b/docs/zero_to_hero_guide/README.md
@@ -208,7 +208,7 @@ In `test_llama_stack.py`, write the following code:
 
 ```python
 import os
-from llama_stack_client import LlamaStackClien
+from llama_stack_client import LlamaStackClient
 
 # Get the model ID from the environment variable
 INFERENCE_MODEL = os.environ.get("INFERENCE_MODEL")
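
For context, a minimal sketch of how the corrected `LlamaStackClient` import is exercised in the zero-to-hero guide's test script. The server port, the environment-variable check, and the exact method names (`inference.chat_completion`, `completion_message.content`) are assumptions based on typical client usage at the time and may differ across `llama_stack_client` versions; they are not part of this patch.

```python
import os

from llama_stack_client import LlamaStackClient

# Read the model ID the distribution was started with.
INFERENCE_MODEL = os.environ.get("INFERENCE_MODEL")
if INFERENCE_MODEL is None:
    raise ValueError("Set the INFERENCE_MODEL environment variable first.")

# Assumed: the Llama Stack server is listening on localhost:8321; adjust the
# port to match how the distribution was launched.
client = LlamaStackClient(base_url="http://localhost:8321")

# Request a short completion to confirm the import and connection work.
response = client.inference.chat_completion(
    model_id=INFERENCE_MODEL,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a two-sentence poem about llamas."},
    ],
)
print(response.completion_message.content)
```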