---
description: "Anthropic inference provider for accessing Claude models and Anthropic's AI services."
sidebar_label: Remote - Anthropic
title: remote::anthropic
---
# remote::anthropic
## Description
Anthropic inference provider for accessing Claude models and Anthropic's AI services.
## Configuration
| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `allowed_models` | `list[str] \| None` | No | | List of models that should be registered with the model registry. If None, all models are allowed. |
| `api_key` | `str \| None` | No | | API key for Anthropic models |
## Sample Configuration
```yaml
api_key: ${env.ANTHROPIC_API_KEY:=}
```
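
For reference, here is a minimal sketch of how this provider configuration might appear inside a run configuration, including the optional `allowed_models` field. The `provider_id` value and the model identifier are illustrative, not required names:

```yaml
providers:
  inference:
    - provider_id: anthropic            # illustrative provider_id
      provider_type: remote::anthropic
      config:
        api_key: ${env.ANTHROPIC_API_KEY:=}
        allowed_models:                 # optional; omit to allow all models
          - claude-3-5-sonnet-latest    # example model identifier
```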