Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-10-11 13:44:38 +00:00)
# What does this PR do?

Use `SecretStr` for OpenAIMixin providers:

- `RemoteInferenceProviderConfig` now has `auth_credential: SecretStr`
- the default alias is `api_key` (the most common name)
- some providers override it to `api_token` (RunPod, vLLM, Databricks)
- some providers exclude it entirely (Ollama, TGI, Vertex AI)

Addresses #3517

## Test Plan

CI with new tests
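The pattern described above can be sketched roughly as follows. This is a minimal illustration assuming Pydantic v2; the class names `RemoteInferenceProviderConfig` come from the PR, but the exact field defaults and the `VLLMProviderConfig` subclass shown here are hypothetical stand-ins, not the repository's actual definitions.

```python
from typing import Optional

from pydantic import BaseModel, Field, SecretStr


class RemoteInferenceProviderConfig(BaseModel):
    # Credential stored as SecretStr so it is masked in repr() and logs.
    # The alias "api_key" is the most common field name across providers;
    # populate_by_name lets callers use either the field name or the alias.
    auth_credential: Optional[SecretStr] = Field(default=None, alias="api_key")

    model_config = {"populate_by_name": True}


class VLLMProviderConfig(RemoteInferenceProviderConfig):
    # Hypothetical example of a provider overriding the alias to "api_token".
    auth_credential: Optional[SecretStr] = Field(default=None, alias="api_token")


cfg = RemoteInferenceProviderConfig(api_key="sk-secret")
print(cfg.auth_credential)                     # masked, e.g. **********
print(cfg.auth_credential.get_secret_value())  # the real value when needed
```

The benefit of `SecretStr` is that the raw value never appears in stack traces, `repr()` output, or serialized models unless `get_secret_value()` is called explicitly.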