llama-stack-mirror/tests/unit/providers/inference
Matthew Farrellee 4dbe0593f9
chore: add provider-data-api-key support to openaimixin (#3639)
# What does this PR do?

The LiteLLMOpenAIMixin already supports reading the API key from provider data (headers the user sends with a request).

This PR adds the same functionality to the OpenAIMixin.

This is infrastructure for migrating providers.
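As a rough illustration of the pattern, the sketch below shows a per-request API key (parsed from headers the user sends) taking precedence over the key in the provider's static config. The class, field, and method names here (`OpenAIMixinSketch`, `ProviderData`, `set_request_provider_data`, `get_api_key`) are hypothetical stand-ins, not the actual OpenAIMixin implementation in this PR.

```python
# Minimal sketch of the provider-data API key fallback described above.
# All names are illustrative assumptions, not llama-stack's real classes.
from dataclasses import dataclass


@dataclass
class ProviderData:
    """Per-request data parsed from headers the user sends."""

    api_key: str | None = None


class OpenAIMixinSketch:
    """Resolves the API key from request provider data, falling back to config."""

    def __init__(self, config_api_key: str | None = None):
        self.config_api_key = config_api_key
        self._request_provider_data: ProviderData | None = None

    def set_request_provider_data(self, data: ProviderData | None) -> None:
        # In a real server this would be populated once per request from headers.
        self._request_provider_data = data

    def get_api_key(self) -> str:
        # Prefer the key supplied with the request, then the configured key.
        data = self._request_provider_data
        if data and data.api_key:
            return data.api_key
        if self.config_api_key:
            return self.config_api_key
        raise ValueError("API key not provided in config or request provider data")


# Usage: a header-supplied key overrides the static config value.
client = OpenAIMixinSketch(config_api_key="config-key")
assert client.get_api_key() == "config-key"
client.set_request_provider_data(ProviderData(api_key="per-request-key"))
assert client.get_api_key() == "per-request-key"
```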


## Test Plan

CI with new tests
2025-10-01 13:44:59 -07:00
| Name | Last commit | Date |
|---|---|---|
| bedrock | fix: use lambda pattern for bedrock config env vars (#3307) | 2025-09-05 10:45:11 +02:00 |
| test_inference_client_caching.py | chore: update the groq inference impl to use openai-python for openai-compat functions (#3348) | 2025-09-06 15:36:27 -07:00 |
| test_litellm_openai_mixin.py | feat: add static embedding metadata to dynamic model listings for providers using OpenAIMixin (#3547) | 2025-09-25 17:17:00 -04:00 |
| test_openai_base_url_config.py | chore: add provider-data-api-key support to openaimixin (#3639) | 2025-10-01 13:44:59 -07:00 |
| test_remote_vllm.py | fix(dev): fix vllm inference recording (await models.list) (#3524) | 2025-09-23 12:56:33 -04:00 |