Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-12-15 02:52:37 +00:00
RemoteInferenceProviderConfig now has `auth_credential: SecretStr`:
- the default alias is `api_key` (the most common name)
- some providers override it to `api_token` (RunPod, vLLM, Databricks)
- some providers exclude it entirely (Ollama, TGI, Vertex AI)
Files touched:
- __init__.py
- config.py
- fireworks.py
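The pattern described above can be sketched with Pydantic. This is a hypothetical illustration of the approach, not the actual llama-stack source: a base config holds the credential as a `SecretStr` aliased to `api_key`, and a subclass (here a made-up `RunPodConfig`) overrides the alias to `api_token`.

```python
from typing import Optional

from pydantic import BaseModel, Field, SecretStr


class RemoteInferenceProviderConfig(BaseModel):
    # Default alias "api_key" covers the most common credential name.
    auth_credential: Optional[SecretStr] = Field(default=None, alias="api_key")


class RunPodConfig(RemoteInferenceProviderConfig):
    # Providers like RunPod, vLLM, and Databricks expect "api_token" instead,
    # so the subclass re-declares the field with a different alias.
    auth_credential: Optional[SecretStr] = Field(default=None, alias="api_token")


cfg = RunPodConfig(api_token="secret-123")
# SecretStr masks the value in repr/str; the raw value is only
# available via an explicit get_secret_value() call.
print(cfg)
print(cfg.auth_credential.get_secret_value())
```

Using `SecretStr` means the credential is redacted in logs and debug output by default, which is the main motivation for storing it this way rather than as a plain `str`.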