# Inference

## Overview

This section contains documentation for all available providers for the **inference** API.
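
Regardless of which provider backs it, clients consume the same inference API. The snippet below is a minimal sketch of a chat completion request using the Python client; the base URL, port, and model identifier are assumptions and will vary with your distribution and configured provider.

```python
from llama_stack_client import LlamaStackClient

# Assumes a Llama Stack server running locally on the default port (8321).
client = LlamaStackClient(base_url="http://localhost:8321")

# The model_id must be registered with whichever inference provider your
# distribution is configured to use (a placeholder is shown here).
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.completion_message.content)
```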

## Providers

```{toctree}
:maxdepth: 1

inline_meta-reference
inline_sentence-transformers
remote_anthropic
remote_bedrock
remote_cerebras
remote_databricks
remote_fireworks
remote_gemini
remote_groq
remote_hf_endpoint
remote_hf_serverless
remote_llama-openai-compat
remote_nvidia
remote_ollama
remote_openai
remote_passthrough
remote_runpod
remote_sambanova
remote_tgi
remote_together
remote_vertexai
remote_vllm
remote_watsonx
```
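
To see which of these providers is actually active in a running stack, one option is to query the providers listing. This is a hedged sketch assuming the same local server as above; the field names follow the provider information returned by the Python client.

```python
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")  # assumed local server

# List configured providers and keep only those serving the inference API.
for provider in client.providers.list():
    if provider.api == "inference":
        print(provider.provider_id, provider.provider_type)
```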