llama-stack-mirror/llama_stack/providers
Matthew Farrellee d6c3b36390
chore: update the gemini inference impl to use openai-python for openai-compat functions (#3351)
# What does this PR do?

Update the Gemini inference provider to use openai-python for the
openai-compat endpoints.

Partially addresses #3349; it does not address /inference/completion or
/inference/chat-completion.
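
As a rough illustration of the pattern (not the provider's actual code), the sketch below shows how the openai-python `AsyncOpenAI` client can be pointed at Gemini's OpenAI-compatible endpoint; the base URL, environment variable, and model name are assumptions for demonstration only.

```python
# Minimal sketch, assuming Gemini's publicly documented OpenAI-compat
# endpoint and the openai-python AsyncOpenAI client. Not the code from
# this PR; the model name and env var are illustrative.
import asyncio
import os

from openai import AsyncOpenAI

client = AsyncOpenAI(
    api_key=os.environ["GEMINI_API_KEY"],  # assumed env var
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)


async def main() -> None:
    # openai-compat chat completion routed to Gemini
    response = await client.chat.completions.create(
        model="gemini-2.0-flash",  # illustrative model id
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(response.choices[0].message.content)


asyncio.run(main())
```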

## Test Plan

CI
2025-09-06 12:22:20 -07:00
| Name | Latest commit | Date |
|------|---------------|------|
| `inline` | feat: Updating Rag Tool to use Files API and Vector Stores API (#3344) | 2025-09-06 07:26:34 -06:00 |
| `registry` | chore: update the gemini inference impl to use openai-python for openai-compat functions (#3351) | 2025-09-06 12:22:20 -07:00 |
| `remote` | chore: update the gemini inference impl to use openai-python for openai-compat functions (#3351) | 2025-09-06 12:22:20 -07:00 |
| `utils` | fix: use lambda pattern for bedrock config env vars (#3307) | 2025-09-05 10:45:11 +02:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | feat: create unregister shield API endpoint in Llama Stack (#2853) | 2025-08-05 07:33:46 -07:00 |