From e777d965a14dedff13c2eeb20e88c5a56ebe79b2 Mon Sep 17 00:00:00 2001
From: Nathan Weinberg <31703736+nathan-weinberg@users.noreply.github.com>
Date: Wed, 5 Feb 2025 23:57:51 -0500
Subject: [PATCH] docs: add addn server guidance for Linux users in Quick Start
 (#972)

# What does this PR do?
- [x] Addresses issue #971

## Test Plan
Ran docs build locally

## Sources
See discussion linked in the issue

## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md), Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Signed-off-by: Nathan Weinberg
Co-authored-by: Mert Parker
---
 docs/source/getting_started/index.md | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/docs/source/getting_started/index.md b/docs/source/getting_started/index.md
index 01cef5400..d8bf42533 100644
--- a/docs/source/getting_started/index.md
+++ b/docs/source/getting_started/index.md
@@ -66,6 +66,23 @@ As another example, to start the container with Podman, you can do the same but
 
 Configuration for this is available at `distributions/ollama/run.yaml`.
 
+:::{admonition} Note
+:class: note
+
+Docker containers run in their own isolated network namespaces on Linux. To allow the container to communicate with services running on the host via `localhost`, you need `--network=host`. This makes the container use the host's network directly so it can connect to Ollama running on `localhost:11434`.
+
+Linux users having issues running the above command should instead try the following:
+```bash
+docker run -it \
+  -p $LLAMA_STACK_PORT:$LLAMA_STACK_PORT \
+  -v ~/.llama:/root/.llama \
+  --network=host \
+  llamastack/distribution-ollama \
+  --port $LLAMA_STACK_PORT \
+  --env INFERENCE_MODEL=$INFERENCE_MODEL \
+  --env OLLAMA_URL=http://localhost:11434
+```
+
+:::
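
---

As a quick sanity check before starting the container, it can help to confirm that Ollama is actually listening on the host. The sketch below is illustrative only (it is not part of the patch): it assumes Ollama's default address of `localhost:11434` and its `/api/tags` endpoint, so adjust if your setup differs.

```shell
#!/bin/sh
# Probe the host's Ollama server before launching the Llama Stack container.
# -s silences curl's progress output; -f makes curl exit non-zero on HTTP errors.
if curl -sf http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is reachable on localhost:11434"
else
  echo "Ollama is NOT reachable on localhost:11434 -- is 'ollama serve' running?"
fi
```

If this probe succeeds on the host but the container still cannot reach Ollama, that points to the network-namespace isolation described in the note above, which `--network=host` works around.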