llama-stack-mirror/docs/docs

Latest commit by Matthew Farrellee (0066d986c5):
feat: use SecretStr for inference provider auth credentials (#3724)
# What does this PR do?

Use `SecretStr` for OpenAIMixin providers (a sketch of the resulting config pattern follows the list):

- `RemoteInferenceProviderConfig` now has `auth_credential: SecretStr`.
- The default field alias is `api_key` (the most common name).
- Some providers override the alias to `api_token` (RunPod, vLLM, Databricks).
- Some providers exclude the credential field entirely (Ollama, TGI, Vertex AI).
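
The change boils down to a Pydantic field stored as `SecretStr` behind a configurable alias. The sketch below is illustrative rather than the actual llama-stack code: `RemoteInferenceProviderConfig`, the `api_key` default alias, and the `api_token` override come from this PR, while the `DatabricksImplConfig` subclass shape and the exact `Field`/`ConfigDict` options are assumptions.

```python
# Hedged sketch, not the actual llama-stack implementation: the base config
# keeps the credential as a pydantic SecretStr under the alias "api_key",
# and a provider subclass re-aliases the same field to "api_token".
from pydantic import BaseModel, ConfigDict, Field, SecretStr


class RemoteInferenceProviderConfig(BaseModel):
    # Allow populating the field by either its name or its alias.
    model_config = ConfigDict(populate_by_name=True)

    # SecretStr masks the value in repr(), str(), and logs.
    auth_credential: SecretStr | None = Field(default=None, alias="api_key")


class DatabricksImplConfig(RemoteInferenceProviderConfig):
    # RunPod, vLLM, and Databricks expose the credential as "api_token";
    # the subclass name and override mechanism shown here are assumptions.
    auth_credential: SecretStr | None = Field(default=None, alias="api_token")


if __name__ == "__main__":
    cfg = RemoteInferenceProviderConfig(api_key="sk-123")
    print(cfg)                                     # auth_credential=SecretStr('**********')
    print(cfg.auth_credential.get_secret_value())  # sk-123 (explicit unwrap required)
```

The point of the pattern is that the secret never leaks through accidental stringification; callers must opt in with `get_secret_value()`.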

Addresses #3517.

## Test Plan

CI with new tests.
Committed 2025-10-10 07:32:50 -07:00
| Name | Last commit | Last commit date |
|------|-------------|------------------|
| advanced_apis | chore!: remove --env from llama stack run (#3711) | 2025-10-07 20:58:15 -07:00 |
| building_applications | chore!: remove --env from llama stack run (#3711) | 2025-10-07 20:58:15 -07:00 |
| concepts | chore: use uvicorn to start llama stack server everywhere (#3625) | 2025-10-06 14:27:40 +02:00 |
| contributing | feat(tests): make inference_recorder into api_recorder (include tool_invoke) (#3403) | 2025-10-09 14:27:51 -07:00 |
| deploying | chore: use uvicorn to start llama stack server everywhere (#3625) | 2025-10-06 14:27:40 +02:00 |
| distributions | chore!: remove model mgmt from CLI for Hugging Face CLI (#3700) | 2025-10-09 16:50:33 -07:00 |
| getting_started | chore!: remove --env from llama stack run (#3711) | 2025-10-07 20:58:15 -07:00 |
| providers | feat: use SecretStr for inference provider auth credentials (#3724) | 2025-10-10 07:32:50 -07:00 |
| references | chore!: remove model mgmt from CLI for Hugging Face CLI (#3700) | 2025-10-09 16:50:33 -07:00 |
| api-overview.md | docs: api separation (#3630) | 2025-10-01 10:13:31 -07:00 |
| index.mdx | docs: fix more broken links (#3649) | 2025-10-02 10:43:49 +02:00 |