llama-stack-mirror/llama_stack/providers/utils/inference
Sébastien Han c4cb6aa8d9
fix: prevent telemetry from leaking sensitive info
Prevent sensitive information from being logged in telemetry output by
assigning the SecretStr type to sensitive fields. API keys and passwords
from the KV store are now covered. All providers have been converted.

Signed-off-by: Sébastien Han <seb@redhat.com>
2025-09-29 09:54:41 +02:00
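The commit above masks sensitive fields by typing them as SecretStr. A minimal sketch of the idea, assuming Pydantic-based provider configs (the `ProviderConfig` model and `api_key` field here are illustrative, not taken from the repo): a SecretStr value is masked when the model is printed or logged, and the raw value must be unwrapped explicitly.

```python
from pydantic import BaseModel, SecretStr


class ProviderConfig(BaseModel):
    # Hypothetical config model: SecretStr masks the value in repr/str output,
    # so telemetry that stringifies the config does not leak the key.
    api_key: SecretStr


cfg = ProviderConfig(api_key="sk-very-secret")
print(cfg)                             # prints a masked value, not the real key
print(cfg.api_key.get_secret_value())  # explicit unwrap required to read it
```

The explicit `get_secret_value()` call makes every use of the raw secret visible in code review, which is why converting plain `str` fields is enough to stop accidental leakage through logging.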
__init__.py chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00
embedding_mixin.py fix: Make SentenceTransformer embedding operations non-blocking (#3335) 2025-09-04 13:58:41 -04:00
inference_store.py chore: simplify authorized sqlstore (#3496) 2025-09-19 16:13:56 -07:00
litellm_openai_mixin.py fix: prevent telemetry from leaking sensitive info 2025-09-29 09:54:41 +02:00
model_registry.py chore: prune mypy exclude list (#3561) 2025-09-26 11:44:43 -04:00
openai_compat.py feat: Add items and title to ToolParameter/ToolParamDefinition (#3003) 2025-09-27 11:35:29 -07:00
openai_mixin.py fix: prevent telemetry from leaking sensitive info 2025-09-29 09:54:41 +02:00
prompt_adapter.py chore(apis): unpublish deprecated /v1/inference apis (#3297) 2025-09-27 11:20:06 -07:00