feat: use SecretStr for inference provider auth credentials

- RemoteInferenceProviderConfig now has auth_credential: SecretStr
- the default field alias is api_key (the most common name)
- some providers override the alias to api_token (RunPod, vLLM, Databricks)
- some providers exclude the credential entirely (Ollama, TGI, Vertex AI)
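
A minimal sketch of the pattern described above, assuming Pydantic v2; the subclass names, defaults, and the way credential-free providers are modeled here are illustrative, not the actual llama-stack definitions:

from pydantic import BaseModel, Field, SecretStr

class RemoteInferenceProviderConfig(BaseModel):
    # Secret credential, populated via the alias "api_key" (the most common name).
    auth_credential: SecretStr | None = Field(default=None, alias="api_key")

class RunPodStyleConfig(RemoteInferenceProviderConfig):
    # Providers such as RunPod, vLLM, and Databricks override the alias to "api_token".
    auth_credential: SecretStr | None = Field(default=None, alias="api_token")

class OllamaStyleConfig(BaseModel):
    # Providers such as Ollama, TGI, and Vertex AI carry no credential at all;
    # modeling that as a config without the field is a simplification of whatever
    # exclusion mechanism the repo actually uses.
    url: str = "http://localhost:11434"

# SecretStr keeps the value out of reprs and logs until explicitly revealed.
cfg = RemoteInferenceProviderConfig(api_key="sk-example")
print(cfg)                                     # auth_credential=SecretStr('**********')
print(cfg.auth_credential.get_secret_value())  # sk-example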
Author: Matthew Farrellee
Date:   2025-10-08 05:05:05 -04:00
Parent: 62bac0aad4
Commit: 6143b9b0c3
56 changed files with 157 additions and 144 deletions


@@ -59,7 +59,7 @@ class OllamaInferenceAdapter(OpenAIMixin):
         return self._clients[loop]
 
     def get_api_key(self):
-        return "NO_KEY"
+        return "NO KEY REQUIRED"
 
     def get_base_url(self):
         return self.config.url.rstrip("/") + "/v1"
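
For a provider that does take a key, get_api_key would presumably unwrap the SecretStr at the point of use instead of returning a placeholder. A hedged, self-contained sketch: only auth_credential, the api_key alias, and the get_api_key/get_base_url method names come from this commit; the adapter class, URL field, and None handling are assumptions.

from pydantic import BaseModel, Field, SecretStr

class ExampleProviderConfig(BaseModel):
    auth_credential: SecretStr | None = Field(default=None, alias="api_key")
    url: str = "https://api.example.com"

class ExampleInferenceAdapter:
    def __init__(self, config: ExampleProviderConfig):
        self.config = config

    def get_api_key(self) -> str:
        # Reveal the masked credential only where it is actually handed to the client.
        if self.config.auth_credential is None:
            return "NO KEY REQUIRED"  # same placeholder the Ollama adapter returns above
        return self.config.auth_credential.get_secret_value()

    def get_base_url(self) -> str:
        return self.config.url.rstrip("/") + "/v1"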