llama-stack/docs/source
Yuan Tang f5a5c5d459
docs: Add instruction on enabling tool calling for remote vLLM (#1719)
# What does this PR do?

This PR adds a link to vLLM's instructions for enabling tool calling. Users have
asked about this many times, e.g.
https://github.com/meta-llama/llama-stack/issues/1648#issuecomment-2740642077
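
Not part of the PR itself, but as a minimal sketch of the flow the linked vLLM docs describe: the vLLM OpenAI-compatible server has to be launched with tool support enabled (for example `vllm serve <model> --enable-auto-tool-choice --tool-call-parser llama3_json`), and the llama-stack `remote::vllm` provider then points its `url` at that server. The snippet below exercises the endpoint directly; the base URL, model name, and `get_weather` tool are illustrative placeholders, not values taken from this PR.

```python
# Illustrative sketch only: assumes a vLLM OpenAI-compatible server is running
# locally with tool calling enabled. URL, model, and tool schema are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="none")

# A single hypothetical tool the model may choose to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the server was started with tool calling enabled and the model decided to
# use the tool, the structured call appears here instead of a plain text reply.
print(response.choices[0].message.tool_calls)
```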

---------

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
2025-03-20 15:18:17 -07:00
| Name | Last commit | Date |
|------|-------------|------|
| building_applications | feat(api): simplify client imports (#1687) | 2025-03-20 10:15:49 -07:00 |
| concepts | Web updates to point to latest releases for Mobile SDK (#1650) | 2025-03-14 17:06:07 -07:00 |
| contributing | chore: consolidate scripts under ./scripts directory (#1646) | 2025-03-17 17:56:30 -04:00 |
| distributions | docs: Add instruction on enabling tool calling for remote vLLM (#1719) | 2025-03-20 15:18:17 -07:00 |
| getting_started | feat(api): simplify client imports (#1687) | 2025-03-20 10:15:49 -07:00 |
| introduction | docs: Remove mentions of focus on Llama models (#1690) | 2025-03-19 00:17:22 -04:00 |
| playground | feat: Created Playground Containerfile and Image Workflow (#1256) | 2025-03-18 09:26:49 -07:00 |
| providers | feat: Qdrant inline provider (#1273) | 2025-03-18 14:04:21 -07:00 |
| references | feat(api): (1/n) datasets api clean up (#1573) | 2025-03-17 16:55:45 -07:00 |
| conf.py | fix: fetched latest pypi version when building documentation | 2025-03-06 21:15:15 -08:00 |
| index.md | docs: Remove mentions of focus on Llama models (#1690) | 2025-03-19 00:17:22 -04:00 |