# What does this PR do?

We were not using conditionals correctly. The `:+` form only substitutes when the environment variable is set, so `${env.ENVIRONMENT:+}` returns None when `ENVIRONMENT` is not set. To provide a conditional value with a fallback, use `${env.ENVIRONMENT:=}` instead: this picks the value of `ENVIRONMENT` when it is set, and otherwise returns None (or the default given after the `=`).

Closes: https://github.com/meta-llama/llama-stack/issues/2564

Signed-off-by: Sébastien Han <seb@redhat.com>
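The difference between the two forms can be sketched with a minimal resolver. This is an illustration of the semantics described above, not llama-stack's actual substitution code; the function name and regex are assumptions for the example.

```python
import os
import re

# Matches ${env.NAME:=default} and ${env.NAME:+value} (illustrative only).
_PATTERN = re.compile(r"\$\{env\.(\w+):([=+])([^}]*)\}")

def expand(template: str):
    """Sketch of the two substitution forms, not the real implementation."""
    m = _PATTERN.fullmatch(template)
    if not m:
        return template
    name, op, rest = m.groups()
    value = os.environ.get(name)
    if op == "=":
        # Default form: the env value when set, else the default (None if empty).
        return value if value is not None else (rest or None)
    # Conditional form (:+): substitute only when the variable is set.
    return rest if value is not None else None
```

With `DEMO_VAR` unset, `expand("${env.DEMO_VAR:+}")` and `expand("${env.DEMO_VAR:=}")` both yield None, but only the `:=` form can supply a fallback such as `expand("${env.DEMO_VAR:=fallback}")`.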
## remote::nvidia

### Description
NVIDIA inference provider for accessing NVIDIA NIM models and AI services.
### Configuration
| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| `url` | `str` | No | `https://integrate.api.nvidia.com` | A base URL for accessing the NVIDIA NIM |
| `api_key` | `SecretStr \| None` | No | | The NVIDIA API key, only needed if using the hosted service |
| `timeout` | `int` | No | `60` | Timeout for the HTTP requests |
| `append_api_version` | `bool` | No | `True` | When set to false, the API version will not be appended to the base_url. By default, it is true. |
### Sample Configuration

```yaml
url: ${env.NVIDIA_BASE_URL:=https://integrate.api.nvidia.com}
api_key: ${env.NVIDIA_API_KEY:=}
append_api_version: ${env.NVIDIA_APPEND_API_VERSION:=True}
```
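To show how the `:=` defaults in the sample resolve, here is a hedged sketch that expands the placeholders when none of the `NVIDIA_*` variables are set. The resolver below is a simplification for illustration, not llama-stack's actual config loader.

```python
import os
import re

# The sample configuration's placeholder strings, copied from above.
SAMPLE = {
    "url": "${env.NVIDIA_BASE_URL:=https://integrate.api.nvidia.com}",
    "api_key": "${env.NVIDIA_API_KEY:=}",
    "append_api_version": "${env.NVIDIA_APPEND_API_VERSION:=True}",
}

def resolve(template: str):
    """Expand ${env.NAME:=default}: env value when set, else default or None."""
    m = re.fullmatch(r"\$\{env\.(\w+):=([^}]*)\}", template)
    if not m:
        return template
    name, default = m.groups()
    value = os.environ.get(name)
    return value if value is not None else (default or None)

config = {key: resolve(val) for key, val in SAMPLE.items()}
```

With no environment variables set, `url` falls back to the hosted endpoint, `api_key` resolves to None (so the empty-default form of the PR's fix works), and `append_api_version` stays at its `True` default.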