llama-stack-mirror/docs/source
Dmitry Rogozhkin 241a42bb26 docs: add example for intel gpu in vllm remote
This PR adds instructions for setting up a vLLM remote endpoint for the vllm-remote
llama stack distribution.
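
At a high level, the flow is: serve a model through vLLM's OpenAI-compatible server on the Intel GPU host, then point the distribution at that endpoint. A minimal sketch is below; the model name, port, host placeholder, and the assumption that the provider reads its endpoint from a `VLLM_URL` environment variable are illustrative, not taken from the added docs.

```
# Sketch (hypothetical values): serve the model via vLLM's OpenAI-compatible
# server on the Intel GPU host.
vllm serve meta-llama/Llama-3.2-3B-Instruct --port 8000

# On the machine running the vllm-remote distribution; assumes the provider
# reads its endpoint URL from the VLLM_URL environment variable.
export VLLM_URL=http://<intel-gpu-host>:8000/v1
```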

* Verified with manual tests of the configured vllm-remote distribution against a
  vLLM endpoint running on a system with an Intel GPU.
* Also verified with the CI pytests (see the command line below). Tests pass to the
  same extent as they do on the NVIDIA A10 setup (some tests do fail, which appears
  to be a known issue with the vllm-remote llama stack distribution).

```
pytest -s -v tests/integration/inference/test_text_inference.py \
   --stack-config=http://localhost:5001 \
   --text-model=meta-llama/Llama-3.2-3B-Instruct
```
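
The `--stack-config=http://localhost:5001` argument assumes a Llama Stack server is already listening on that port. As a rough sketch of how such a server might be started (the template name, port, and env value here are assumptions, not taken from this PR):

```
# Sketch: launch the remote vLLM distribution on port 5001 so the pytest run
# above can reach it; VLLM_URL points at the vLLM endpoint on the Intel GPU host.
llama stack run remote-vllm --port 5001 --env VLLM_URL=http://<intel-gpu-host>:8000/v1
```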

Signed-off-by: Dmitry Rogozhkin <dmitry.v.rogozhkin@intel.com>
2025-04-15 07:15:37 -07:00
| Path | Latest commit | Date |
|------|---------------|------|
| building_applications | docs: Minor updates to docs to make them a little friendlier to new users (#1871) | 2025-04-04 08:10:35 -04:00 |
| concepts | docs: fix typos in evaluation concepts (#1745) | 2025-03-21 12:00:53 -07:00 |
| contributing | docs: Updating docs to source from CONTRIBUTING.md (#1850) | 2025-04-01 14:50:04 +02:00 |
| distributions | docs: add example for intel gpu in vllm remote | 2025-04-15 07:15:37 -07:00 |
| getting_started | docs: Update quickstart page to structure things a little more for the novices (#1873) | 2025-04-10 14:09:00 -07:00 |
| introduction | docs: Remove mentions of focus on Llama models (#1690) | 2025-03-19 00:17:22 -04:00 |
| playground | chore: simplify running the demo UI (#1907) | 2025-04-09 11:22:29 -07:00 |
| providers | docs: Update quickstart page to structure things a little more for the novices (#1873) | 2025-04-10 14:09:00 -07:00 |
| references | feat(api): (1/n) datasets api clean up (#1573) | 2025-03-17 16:55:45 -07:00 |
| conf.py | chore: Detect browser setting for dark/light mode and set default to light mode (#1913) | 2025-04-09 12:40:56 -04:00 |
| index.md | docs: Update quickstart page to structure things a little more for the novices (#1873) | 2025-04-10 14:09:00 -07:00 |