llama-stack-mirror/docs/docs/getting_started
Costa Shulyupin 2b85600a7e
docs: make inference model configurable (#4385)
Allow users to specify the inference model through the INFERENCE_MODEL
environment variable instead of hardcoding it, with fallback to
ollama/llama3.2:3b if not set.
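The pattern described in the commit can be sketched as follows. This is a minimal, assumed illustration of the configurable-model idea, not the actual contents of `demo_script.py`: the model identifier is read from the `INFERENCE_MODEL` environment variable, falling back to `ollama/llama3.2:3b` when the variable is unset.

```python
import os

# Read the inference model from the environment, with the fallback
# named in the commit message (illustrative sketch, not the real script).
model = os.environ.get("INFERENCE_MODEL", "ollama/llama3.2:3b")
print(f"Using inference model: {model}")
```

Running the script with `INFERENCE_MODEL=ollama/llama3.2:1b python demo_script.py` would select that model instead of the hardcoded default.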

Signed-off-by: Costa Shulyupin <costa.shul@redhat.com>

2025-12-15 11:02:28 +01:00
demo_script.py          docs: make inference model configurable (#4385)            2025-12-15 11:02:28 +01:00
detailed_tutorial.mdx   feat: remove usage of build yaml (#4192)                   2025-12-10 10:12:12 +01:00
libraries.mdx           docs: concepts and building_applications migration (#3534) 2025-09-24 14:05:30 -07:00
quickstart.mdx          feat: Add static file import system for docs (#3882)       2025-10-24 14:01:33 -04:00