llama-stack-mirror/docs/docs/providers/inference
Matthew Farrellee 0066d986c5
feat: use SecretStr for inference provider auth credentials (#3724)
# What does this PR do?

Use SecretStr for OpenAIMixin providers:

- RemoteInferenceProviderConfig now has an auth_credential: SecretStr field (sketched below)
- the default alias is api_key (the most common name)
- some providers override the alias to api_token (RunPod, vLLM, Databricks)
- some providers exclude the credential entirely (Ollama, TGI, Vertex AI)

Addresses #3517
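
As a rough illustration, here is a minimal Pydantic sketch of the shape this produces. The subclass name, defaults, and usage are hypothetical, not the actual llama-stack code:

```python
from pydantic import BaseModel, Field, SecretStr


class RemoteInferenceProviderConfig(BaseModel):
    # Default alias "api_key"; SecretStr keeps the value out of repr() and logs.
    auth_credential: SecretStr | None = Field(default=None, alias="api_key")


class ExampleTokenProviderConfig(RemoteInferenceProviderConfig):
    # Hypothetical provider config: overrides the alias to "api_token",
    # as RunPod, vLLM, and Databricks do.
    auth_credential: SecretStr | None = Field(default=None, alias="api_token")


cfg = ExampleTokenProviderConfig(api_token="my-secret-token")
print(cfg.auth_credential)                      # **********
print(cfg.auth_credential.get_secret_value())   # my-secret-token
```

Callers that need the raw value go through get_secret_value(); everything else (repr, logging, dumps without explicit unwrapping) sees the redacted form.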

## Test Plan

CI with new tests
2025-10-10 07:32:50 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| index.mdx | docs: API docstrings cleanup for better documentation rendering (#3661) | 2025-10-06 10:46:33 -07:00 |
| inline_meta-reference.mdx | docs: provider and distro codegen migration (#3531) | 2025-09-24 14:01:29 -07:00 |
| inline_sentence-transformers.mdx | docs: provider and distro codegen migration (#3531) | 2025-09-24 14:01:29 -07:00 |
| remote_anthropic.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_azure.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_bedrock.mdx | feat: add refresh_models support to inference adapters (default: false) (#3719) | 2025-10-07 15:19:56 +02:00 |
| remote_cerebras.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_databricks.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_fireworks.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_gemini.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_groq.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_hf_endpoint.mdx | docs: provider and distro codegen migration (#3531) | 2025-09-24 14:01:29 -07:00 |
| remote_hf_serverless.mdx | docs: provider and distro codegen migration (#3531) | 2025-09-24 14:01:29 -07:00 |
| remote_llama-openai-compat.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_nvidia.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_ollama.mdx | feat: add refresh_models support to inference adapters (default: false) (#3719) | 2025-10-07 15:19:56 +02:00 |
| remote_openai.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_passthrough.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_runpod.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_sambanova-openai-compat.mdx | docs: provider and distro codegen migration (#3531) | 2025-09-24 14:01:29 -07:00 |
| remote_sambanova.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_tgi.mdx | feat: add refresh_models support to inference adapters (default: false) (#3719) | 2025-10-07 15:19:56 +02:00 |
| remote_together.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_vertexai.mdx | feat: add refresh_models support to inference adapters (default: false) (#3719) | 2025-10-07 15:19:56 +02:00 |
| remote_vllm.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| remote_watsonx.mdx | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |