llama-stack-mirror/llama_stack/providers/utils/bedrock
Matthew Farrellee 0066d986c5
feat: use SecretStr for inference provider auth credentials (#3724)
# What does this PR do?

Use `SecretStr` for OpenAIMixin providers:

- `RemoteInferenceProviderConfig` now has an `auth_credential: SecretStr` field.
- The default field alias is `api_key` (the most common name).
- Some providers override the alias to `api_token` (RunPod, vLLM, Databricks).
- Some providers exclude the credential entirely (Ollama, TGI, Vertex AI).

Addresses #3517.
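
A minimal sketch of the pattern described above, assuming Pydantic v2; the `RunPodLikeConfig` name and the exact field definitions here are illustrative assumptions, not copied from the repo:

```python
# Illustrative sketch only (assumes Pydantic v2); names beyond
# RemoteInferenceProviderConfig.auth_credential are hypothetical.
from pydantic import BaseModel, Field, SecretStr


class RemoteInferenceProviderConfig(BaseModel):
    # Stored as SecretStr so the value is masked in repr()/logs;
    # the alias lets configs keep using the common name "api_key".
    auth_credential: SecretStr | None = Field(default=None, alias="api_key")


class RunPodLikeConfig(RemoteInferenceProviderConfig):
    # Providers such as RunPod, vLLM, and Databricks override the alias
    # to "api_token" while reusing the same underlying field.
    auth_credential: SecretStr | None = Field(default=None, alias="api_token")


cfg = RemoteInferenceProviderConfig(api_key="sk-example-123")
print(cfg.auth_credential)                     # -> **********
print(cfg.auth_credential.get_secret_value())  # -> sk-example-123
```

Using `SecretStr` keeps the credential masked in `repr()` and log output, while `get_secret_value()` still returns the raw key where the provider client needs it.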

## Test Plan

CI with new tests.
2025-10-10 07:32:50 -07:00
| File | Last commit message | Date |
|---|---|---|
| `__init__.py` | add missing init file | 2024-12-17 11:49:03 -08:00 |
| `client.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `config.py` | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| `refreshable_boto_session.py` | feat: drop python 3.10 support (#2469) | 2025-06-19 12:07:14 +05:30 |