Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-07-27 06:28:50 +00:00)
feat: Add Google Vertex AI inference provider support
- Add a new Vertex AI remote inference provider with litellm integration
- Support Gemini models through the Google Cloud Vertex AI platform
- Use Google Cloud Application Default Credentials (ADC) for authentication
- Add Vertex AI models: gemini-2.5-flash, gemini-2.5-pro, gemini-2.0-flash
- Update the provider registry to include the vertexai provider
- Update the starter template to support Vertex AI configuration
- Add documentation and a sample configuration

Signed-off-by: Eran Cohen <eranco@redhat.com>
This commit is contained in:
parent
c0563c0560
commit
1f421238b8
12 changed files with 311 additions and 0 deletions
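The starter-template change registers a `remote::vertexai` provider. A hypothetical run-config fragment is sketched below; the field names (`project`, `location`) and values are illustrative placeholders, not taken from the commit, so check the sample configuration added by this change for the actual schema. With ADC, no API key is stored in the config: credentials are resolved from the environment (e.g. `gcloud auth application-default login`).

```yaml
# Hypothetical run-config fragment for the Vertex AI inference provider.
# Field names are assumptions for illustration; see the commit's sample config.
providers:
  inference:
    - provider_id: vertexai
      provider_type: remote::vertexai
      config:
        project: my-gcp-project   # placeholder GCP project ID
        location: us-central1     # placeholder Vertex AI region
```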
```diff
@@ -18,6 +18,7 @@ distribution_spec:
   - remote::openai
   - remote::anthropic
   - remote::gemini
+  - remote::vertexai
   - remote::groq
   - remote::llama-openai-compat
   - remote::sambanova
```