name: local-ollama-conda-example
distribution_spec:
  distribution_type: local
  description: Use code from `llama_toolchain` itself to serve all llama stack APIs
  docker_image: null
  providers:
    inference: remote::ollama
    memory: meta-reference-faiss
    safety: meta-reference
    agentic_system: meta-reference
    telemetry: console
image_type: conda
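# Notes (a sketch, not verified against a specific llama_toolchain version):
# the `providers` map routes each llama stack API to an implementation.
# Here inference is delegated to a remote Ollama server (`remote::ollama`),
# while memory (FAISS-backed), safety, and agentic_system use the in-repo
# meta-reference providers and telemetry logs to the console. With
# `docker_image: null` and `image_type: conda`, the stack is built into a
# conda environment rather than a docker image. Assumed usage via the
# `llama stack` CLI (exact flags may differ by version):
#
#   llama stack build   # point the build step at this spec when prompted
#   llama stack run local-ollama-conda-example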