llama-stack-mirror/tests/unit/providers/inference
Sébastien Han c4cb6aa8d9
fix: prevent telemetry from leaking sensitive info
Prevent sensitive information from being logged in telemetry output by
typing sensitive fields as SecretStr. API keys and passwords from the
KV store are now covered. All providers have been converted.

Signed-off-by: Sébastien Han <seb@redhat.com>
2025-09-29 09:54:41 +02:00
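The idea behind the fix can be sketched with Pydantic's `SecretStr`: fields typed this way are masked in `str()`/`repr()` output, so they cannot leak through log or telemetry serialization. The model and field names below are illustrative, not the actual llama-stack provider schema.

```python
# Sketch of masking sensitive config fields with Pydantic SecretStr.
# ProviderConfig and its fields are hypothetical examples.
from pydantic import BaseModel, SecretStr

class ProviderConfig(BaseModel):
    url: str
    api_key: SecretStr  # masked as '**********' when the model is printed or logged

cfg = ProviderConfig(url="https://api.example.com", api_key="sk-very-secret")
print(cfg)  # the api_key value is masked, not shown in plain text
print(cfg.api_key.get_secret_value())  # an explicit call is required to read the secret
```

Because masking happens at the type level, any telemetry code that stringifies the config object gets the masked form by default, rather than relying on each call site to redact fields.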
bedrock | fix: use lambda pattern for bedrock config env vars (#3307) | 2025-09-05 10:45:11 +02:00
test_inference_client_caching.py | fix: prevent telemetry from leaking sensitive info | 2025-09-29 09:54:41 +02:00
test_litellm_openai_mixin.py | fix: prevent telemetry from leaking sensitive info | 2025-09-29 09:54:41 +02:00
test_openai_base_url_config.py | fix: prevent telemetry from leaking sensitive info | 2025-09-29 09:54:41 +02:00
test_remote_vllm.py | fix(dev): fix vllm inference recording (await models.list) (#3524) | 2025-09-23 12:56:33 -04:00