llama-stack-mirror/llama_stack/providers
Sébastien Han 73e99b6eab
fix: add token to the openai request
OpenAIMixin expects an API key and creates its own AsyncOpenAI
client, so our code now authenticates with the Google service, retrieves
a token, and passes it to the OpenAI client.
It falls back to an empty string if credentials can't be obtained
(letting LiteLLM handle ADC directly).

Signed-off-by: Sébastien Han <seb@redhat.com>
2025-09-10 15:17:37 +02:00
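The flow described in the commit message can be sketched as follows. This is a hypothetical illustration, not the actual provider code: the helper names (`get_google_access_token`, `make_client`) are assumptions, and the real implementation lives in the `remote` provider tree. It uses the `google-auth` library to refresh Application Default Credentials and hands the resulting bearer token to `AsyncOpenAI` in place of an API key, returning an empty string when no credentials are available so LiteLLM can resolve ADC on its own.

```python
def get_google_access_token() -> str:
    """Fetch a Google OAuth2 access token, or "" if credentials are unavailable."""
    try:
        import google.auth
        import google.auth.transport.requests

        credentials, _project = google.auth.default(
            scopes=["https://www.googleapis.com/auth/cloud-platform"]
        )
        # Refresh to populate credentials.token with a short-lived bearer token.
        credentials.refresh(google.auth.transport.requests.Request())
        return credentials.token or ""
    except Exception:
        # No ADC available: fall back to an empty key and let LiteLLM
        # handle Application Default Credentials directly.
        return ""


def make_client(base_url: str):
    """Build an AsyncOpenAI client authenticated with the Google token."""
    from openai import AsyncOpenAI

    # AsyncOpenAI accepts the token in place of a regular API key.
    return AsyncOpenAI(api_key=get_google_access_token(), base_url=base_url)
```

The empty-string fallback matters because `AsyncOpenAI` raises if `api_key` is missing entirely; passing `""` keeps client construction working while deferring real authentication to LiteLLM's ADC handling.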
..
inline feat: Add vector_db_id to chunk metadata (#3304) 2025-09-10 13:40:27 +02:00
registry fix(deps): bump datasets versions for all providers (#3382) 2025-09-10 13:40:27 +02:00
remote fix: add token to the openai request 2025-09-10 15:17:37 +02:00
utils fix: use lambda pattern for bedrock config env vars (#3307) 2025-09-05 10:45:11 +02:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
datatypes.py feat: create unregister shield API endpoint in Llama Stack (#2853) 2025-08-05 07:33:46 -07:00