llama-stack-mirror/docs/source
Latest commit 1f421238b8 by Eran Cohen: feat: Add Google Vertex AI inference provider support
- Add new Vertex AI remote inference provider with litellm integration
- Support for Gemini models through Google Cloud Vertex AI platform
- Uses Google Cloud Application Default Credentials (ADC) for authentication
- Added Vertex AI models: gemini-2.5-flash, gemini-2.5-pro, gemini-2.0-flash
- Updated provider registry to include vertexai provider
- Updated starter template to support Vertex AI configuration
- Added comprehensive documentation and sample configuration

Signed-off-by: Eran Cohen <eranco@redhat.com>
2025-07-24 09:49:23 +03:00
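
The provider wraps Vertex AI through litellm and authenticates with Application Default Credentials (ADC) rather than an API key, typically set up with `gcloud auth application-default login`. A minimal sketch of the underlying call pattern follows; it is not the provider's internal code, and the project ID and region shown are placeholders:

```python
# Sketch of the litellm call pattern the provider builds on; assumes ADC is
# already configured (e.g. via `gcloud auth application-default login`).
import litellm

response = litellm.completion(
    model="vertex_ai/gemini-2.5-flash",  # one of the Gemini models listed above
    messages=[{"role": "user", "content": "Say hello from Vertex AI."}],
    vertex_project="my-gcp-project",     # placeholder GCP project ID
    vertex_location="us-central1",       # placeholder Vertex AI region
)
print(response.choices[0].message.content)
```

Because credentials come from ADC, no provider-specific API key has to be embedded in the stack configuration.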
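With the starter template updated to support Vertex AI, the provider is reached through the stack's regular inference API. A hypothetical client-side sketch, assuming a llama-stack server is already running locally with the vertexai provider enabled; the base URL and model identifier are assumptions, not values taken from the commit:

```python
# Hypothetical usage against a running llama-stack server with the vertexai
# inference provider enabled; base URL and model_id are assumptions.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

response = client.inference.chat_completion(
    model_id="vertexai/gemini-2.5-flash",  # placeholder model identifier
    messages=[{"role": "user", "content": "Hello from Llama Stack on Vertex AI"}],
)
print(response.completion_message.content)
```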
advanced_apis | docs: Reorganize documentation on the webpage (#2651) | 2025-07-15 14:19:35 -07:00
building_applications | docs: Reorganize documentation on the webpage (#2651) | 2025-07-15 14:19:35 -07:00
concepts | docs: add missing bold title to match others (#2782) | 2025-07-16 18:05:48 +02:00
contributing | docs: revamp testing documentation (#2155) | 2025-05-13 11:28:29 -07:00
deploying | chore: update k8s template (#2786) | 2025-07-16 15:07:26 -07:00
distributions | docs: add virtualenv instructions for running starter distro (#2780) | 2025-07-18 09:07:43 -07:00
getting_started | chore: Adding demo script and importing it into the docs (#2848) | 2025-07-21 22:53:32 -04:00
providers | feat: Add Google Vertex AI inference provider support | 2025-07-24 09:49:23 +03:00
references | docs: update outdated llama stack client documentation (#2758) | 2025-07-15 11:49:59 -07:00
conf.py | docs: Reorganize documentation on the webpage (#2651) | 2025-07-15 14:19:35 -07:00
index.md | docs: Reorganize documentation on the webpage (#2651) | 2025-07-15 14:19:35 -07:00