llama-stack-mirror/llama_stack/providers/remote/inference
Matthew Farrellee 0e27016cf2
chore: update the vertexai inference impl to use openai-python for openai-compat functions (#3377)
# What does this PR do?

Update the VertexAI inference provider to use openai-python for the openai-compat functions.

## Test Plan

```
$ VERTEX_AI_PROJECT=... uv run llama stack build --image-type venv --providers inference=remote::vertexai --run
...
$ LLAMA_STACK_CONFIG=http://localhost:8321 uv run --group test pytest -v -ra --text-model vertexai/vertex_ai/gemini-2.5-flash tests/integration/inference/test_openai_completion.py
...
```
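As an additional hand check beyond the pytest run above, a minimal sketch of exercising the running stack directly through openai-python; the `/v1/openai/v1` base path and the placeholder `api_key` are assumptions and may differ by Llama Stack version:

```python
# Hypothetical smoke test: call the running Llama Stack server through its
# OpenAI-compatible surface using openai-python. The base_url prefix below is
# an assumption and may differ by Llama Stack version.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8321/v1/openai/v1",  # assumed OpenAI-compat prefix
    api_key="not-needed-locally",                   # placeholder; local auth setup may vary
)

resp = client.chat.completions.create(
    model="vertexai/vertex_ai/gemini-2.5-flash",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```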

I don't have an account to test this. `get_api_key` may also need to be
updated per
https://cloud.google.com/vertex-ai/generative-ai/docs/start/openai

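Since the linked doc drives auth through Google application default credentials rather than a static key, here is a minimal, untested sketch of what an updated `get_api_key` flow could look like; the google-auth usage and the OpenAI-compat endpoint URL are assumptions drawn from that page, not from this provider's code:

```python
# Untested sketch, per the Vertex AI OpenAI-compat docs linked above: the "API key"
# is a short-lived OAuth2 access token from application default credentials.
import google.auth
import google.auth.transport.requests
from openai import OpenAI

project = "my-gcp-project"   # placeholder: VERTEX_AI_PROJECT
location = "us-central1"     # placeholder: VERTEX_AI_LOCATION

creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
creds.refresh(google.auth.transport.requests.Request())  # token is short-lived; refresh before use

client = OpenAI(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1/projects/{project}"
             f"/locations/{location}/endpoints/openapi",
    api_key=creds.token,
)
```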
---------

Signed-off-by: Sébastien Han <seb@redhat.com>
Co-authored-by: Sébastien Han <seb@redhat.com>
2025-09-10 15:39:29 +02:00
| Name | Latest commit | Date |
|---|---|---|
| anthropic | chore: update the anthropic inference impl to use openai-python for openai-compat functions (#3366) | 2025-09-07 14:00:42 -07:00 |
| bedrock | feat(starter)!: simplify starter distro; litellm model registry changes (#2916) | 2025-07-25 15:02:04 -07:00 |
| cerebras | feat(starter)!: simplify starter distro; litellm model registry changes (#2916) | 2025-07-25 15:02:04 -07:00 |
| databricks | feat(starter)!: simplify starter distro; litellm model registry changes (#2916) | 2025-07-25 15:02:04 -07:00 |
| fireworks | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| gemini | chore: update the gemini inference impl to use openai-python for openai-compat functions (#3351) | 2025-09-06 12:22:20 -07:00 |
| groq | chore: update the groq inference impl to use openai-python for openai-compat functions (#3348) | 2025-09-06 15:36:27 -07:00 |
| llama_openai_compat | chore: indicate to mypy that InferenceProvider.rerank is concrete (#3238) | 2025-08-22 12:02:13 -07:00 |
| nvidia | docs: add VLM NIM example (#3277) | 2025-08-29 16:23:52 -07:00 |
| ollama | feat(tests): auto-merge all model list responses and unify recordings (#3320) | 2025-09-03 11:33:03 -07:00 |
| openai | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| passthrough | chore(rename): move llama_stack.distribution to llama_stack.core (#2975) | 2025-07-30 23:30:53 -07:00 |
| runpod | ci: test safety with starter (#2628) | 2025-07-09 16:53:50 +02:00 |
| sambanova | chore: update the sambanova inference impl to use openai-python for openai-compat functions (#3345) | 2025-09-06 12:25:13 -07:00 |
| tgi | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| together | refactor(logging): rename llama_stack logger categories (#3065) | 2025-08-21 17:31:04 -07:00 |
| vertexai | chore: update the vertexai inference impl to use openai-python for openai-compat functions (#3377) | 2025-09-10 15:39:29 +02:00 |
| vllm | chore: indicate to mypy that InferenceProvider.batch_completion/batch_chat_completion is concrete (#3239) | 2025-08-22 14:17:30 -07:00 |
| watsonx | chore(python-deps): replace ibm_watson_machine_learning with ibm_watsonx_ai (#3302) | 2025-09-03 11:33:35 +02:00 |
| __init__.py | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |