# What does this PR do?

We were not using conditionals correctly: the `:+` form only substitutes when the env variable is set, so `${env.ENVIRONMENT:+}` returns None if ENVIRONMENT is not set. If you want a conditional value, you need `${env.ENVIRONMENT:=}`, which picks the value of ENVIRONMENT if it is set and otherwise returns None.

Closes: https://github.com/meta-llama/llama-stack/issues/2564

Signed-off-by: Sébastien Han <seb@redhat.com>
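As a minimal sketch of the change, using the RunPod provider's `url` field from the docs below for illustration (the PR may touch other templates as well):

```yaml
# Before: ':+' only substitutes its right-hand side when the variable is set,
# so with an empty right-hand side it never yields the variable's own value.
url: ${env.RUNPOD_URL:+}

# After: ':=' resolves to RUNPOD_URL when it is set, otherwise to the empty
# default, i.e. None.
url: ${env.RUNPOD_URL:=}
```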
# remote::runpod
## Description

RunPod inference provider for running models on RunPod's cloud GPU platform.
## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `str \| None` | No | | The URL for the Runpod model serving endpoint |
| `api_token` | `str \| None` | No | | The API token |
## Sample Configuration

```yaml
url: ${env.RUNPOD_URL:=}
api_token: ${env.RUNPOD_API_TOKEN:=}
```
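For illustration only (the values below are placeholders, not from the repo), this is how the sample resolves depending on whether the variables are exported:

```yaml
# With RUNPOD_URL and RUNPOD_API_TOKEN exported in the environment:
url: https://api.runpod.ai/v2/your-endpoint-id
api_token: your-runpod-api-token

# With neither variable set, both fields resolve to None,
# matching the empty defaults in the table above.
```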