chore: update docs for telemetry api removal (#3900)

# What does this PR do?
Telemetry is no longer a standalone API/provider, so this PR removes the leftover references to the `telemetry` API and the `inline::meta-reference` telemetry provider from the docs.
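
Concretely, a `run.yaml` after this change no longer lists `telemetry` under `apis:` and has no telemetry block under `providers:`. An abridged sketch mirroring the config diffs below (the `ollama` entry and its empty `config` are illustrative only):

```yaml
# Abridged post-change config: no `telemetry` under `apis:` and no
# telemetry section under `providers:`.
apis:
- inference
- vector_io
- safety
providers:
  inference:
  - provider_id: ollama
    provider_type: remote::ollama
    config: {}
```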

## Test Plan


````diff
@@ -21,7 +21,6 @@ apis:
 - inference
 - vector_io
 - safety
-- telemetry
 providers:
   inference:
   - provider_id: ollama
@@ -51,10 +50,6 @@ providers:
   responses:
     backend: sql_default
     table_name: responses
-  telemetry:
-  - provider_id: meta-reference
-    provider_type: inline::meta-reference
-    config: {}
 storage:
   backends:
     kv_default:
@@ -92,7 +87,6 @@ apis:
 - inference
 - vector_io
 - safety
-- telemetry
 ```
 ## Providers
@@ -589,24 +583,13 @@ created by users sharing a team with them:
 In addition to resource-based access control, Llama Stack supports endpoint-level authorization using OAuth 2.0 style scopes. When authentication is enabled, specific API endpoints require users to have particular scopes in their authentication token.
 **Scope-Gated APIs:**
 The following APIs are currently gated by scopes:
-- **Telemetry API** (scope: `telemetry.read`):
-  - `POST /telemetry/traces` - Query traces
-  - `GET /telemetry/traces/{trace_id}` - Get trace by ID
-  - `GET /telemetry/traces/{trace_id}/spans/{span_id}` - Get span by ID
-  - `POST /telemetry/spans/{span_id}/tree` - Get span tree
-  - `POST /telemetry/spans` - Query spans
-  - `POST /telemetry/metrics/{metric_name}` - Query metrics
 **Authentication Configuration:**
 For **JWT/OAuth2 providers**, scopes should be included in the JWT's claims:
 ```json
 {
   "sub": "user123",
-  "scope": "telemetry.read",
+  "scope": "<scope>",
   "aud": "llama-stack"
 }
 ```
@@ -616,7 +599,7 @@ For **custom authentication providers**, the endpoint must return user attribute
 {
   "principal": "user123",
   "attributes": {
-    "scopes": ["telemetry.read"]
+    "scopes": ["<scope>"]
   }
 }
 ```
````
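
For readers skimming the auth change above: `<scope>` is a placeholder for whatever scope a deployment gates an endpoint on. A minimal sketch of such a check (illustrative only, not Llama Stack's actual middleware; the helper and the example scope name `models.read` are made up):

```python
# Illustrative scope check. OAuth2 (RFC 6749) carries "scope" as a single
# space-delimited string; custom auth providers above return a list instead.
def has_scope(user: dict, required: str) -> bool:
    granted = set(user.get("scope", "").split())  # JWT-style "scope" claim
    granted.update(user.get("attributes", {}).get("scopes", []))  # custom provider
    return required in granted

jwt_claims = {"sub": "user123", "scope": "models.read", "aud": "llama-stack"}
print(has_scope(jwt_claims, "models.read"))  # True
```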


```diff
@@ -2,10 +2,10 @@
 Remote-Hosted distributions are available endpoints serving Llama Stack API that you can directly connect to.
-| Distribution | Endpoint | Inference | Agents | Memory | Safety | Telemetry |
-|-------------|----------|-----------|---------|---------|---------|------------|
-| Together | [https://llama-stack.together.ai](https://llama-stack.together.ai) | remote::together | meta-reference | remote::weaviate | meta-reference | meta-reference |
-| Fireworks | [https://llamastack-preview.fireworks.ai](https://llamastack-preview.fireworks.ai) | remote::fireworks | meta-reference | remote::weaviate | meta-reference | meta-reference |
+| Distribution | Endpoint | Inference | Agents | Memory | Safety |
+|-------------|----------|-----------|---------|---------|---------|
+| Together | [https://llama-stack.together.ai](https://llama-stack.together.ai) | remote::together | meta-reference | remote::weaviate | meta-reference |
+| Fireworks | [https://llamastack-preview.fireworks.ai](https://llamastack-preview.fireworks.ai) | remote::fireworks | meta-reference | remote::weaviate | meta-reference |
 ## Connecting to Remote-Hosted Distributions
```
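
Since these distributions are plain HTTP endpoints, connecting is just pointing a client at a URL from the table. A sketch using the Python `llama-stack-client` package (assumes it is installed; the endpoint is the Together entry from the table above):

```python
from llama_stack_client import LlamaStackClient

# Point the client at a remote-hosted distribution; no local server needed.
client = LlamaStackClient(base_url="https://llama-stack.together.ai")

# List the models the endpoint serves.
for model in client.models.list():
    print(model.identifier)
```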


```diff
@@ -21,7 +21,6 @@ The `llamastack/distribution-watsonx` distribution consists of the following pro
 | inference | `remote::watsonx`, `inline::sentence-transformers` |
 | safety | `inline::llama-guard` |
 | scoring | `inline::basic`, `inline::llm-as-judge`, `inline::braintrust` |
-| telemetry | `inline::meta-reference` |
 | tool_runtime | `remote::brave-search`, `remote::tavily-search`, `inline::rag-runtime`, `remote::model-context-protocol` |
 | vector_io | `inline::faiss` |
```


```diff
@@ -13,9 +13,9 @@ self
 The `llamastack/distribution-tgi` distribution consists of the following provider configurations.
-| **API** | **Inference** | **Agents** | **Memory** | **Safety** | **Telemetry** |
-|----------------- |--------------- |---------------- |-------------------------------------------------- |---------------- |---------------- |
-| **Provider(s)** | remote::tgi | meta-reference | meta-reference, remote::pgvector, remote::chroma | meta-reference | meta-reference |
+| **API** | **Inference** | **Agents** | **Memory** | **Safety** |
+|----------------- |--------------- |---------------- |-------------------------------------------------- |---------------- |
+| **Provider(s)** | remote::tgi | meta-reference | meta-reference, remote::pgvector, remote::chroma | meta-reference |
 The only difference vs. the `tgi` distribution is that it runs the Dell-TGI server for inference.
```


```diff
@@ -22,7 +22,6 @@ The `llamastack/distribution-dell` distribution consists of the following provid
 | inference | `remote::tgi`, `inline::sentence-transformers` |
 | safety | `inline::llama-guard` |
 | scoring | `inline::basic`, `inline::llm-as-judge`, `inline::braintrust` |
-| telemetry | `inline::meta-reference` |
 | tool_runtime | `remote::brave-search`, `remote::tavily-search`, `inline::rag-runtime` |
 | vector_io | `inline::faiss`, `remote::chromadb`, `remote::pgvector` |
```


```diff
@@ -21,7 +21,6 @@ The `llamastack/distribution-passthrough` distribution consists of the following
 | inference | `remote::passthrough`, `inline::sentence-transformers` |
 | safety | `inline::llama-guard` |
 | scoring | `inline::basic`, `inline::llm-as-judge`, `inline::braintrust` |
-| telemetry | `inline::meta-reference` |
 | tool_runtime | `remote::brave-search`, `remote::tavily-search`, `remote::wolfram-alpha`, `inline::rag-runtime`, `remote::model-context-protocol` |
 | vector_io | `inline::faiss`, `remote::chromadb`, `remote::pgvector` |
```


```diff
@@ -26,7 +26,6 @@ The starter distribution consists of the following provider configurations:
 | inference | `remote::openai`, `remote::fireworks`, `remote::together`, `remote::ollama`, `remote::anthropic`, `remote::gemini`, `remote::groq`, `remote::sambanova`, `remote::vllm`, `remote::tgi`, `remote::cerebras`, `remote::llama-openai-compat`, `remote::nvidia`, `remote::hf::serverless`, `remote::hf::endpoint`, `inline::sentence-transformers` |
 | safety | `inline::llama-guard` |
 | scoring | `inline::basic`, `inline::llm-as-judge`, `inline::braintrust` |
-| telemetry | `inline::meta-reference` |
 | tool_runtime | `remote::brave-search`, `remote::tavily-search`, `inline::rag-runtime`, `remote::model-context-protocol` |
 | vector_io | `inline::faiss`, `inline::sqlite-vec`, `inline::milvus`, `remote::chromadb`, `remote::pgvector` |
@@ -119,7 +118,7 @@ The following environment variables can be configured:
 ### Telemetry Configuration
 - `OTEL_SERVICE_NAME`: OpenTelemetry service name
 - `TELEMETRY_SINKS`: Telemetry sinks (default: `[]`)
 - `OTEL_EXPORTER_OTLP_ENDPOINT`: OpenTelemetry collector endpoint URL
 ## Enabling Providers
```
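
As a rough sketch of how the telemetry variables above fit together (the endpoint uses the conventional OTLP/HTTP port, not a value mandated by the starter distribution; `TELEMETRY_SINKS` values are deployment-specific, so its empty default is left alone), they would typically be set in the environment before launching the stack, e.g. with `llama stack run`:

```python
# Illustrative only: setting the telemetry-related environment variables
# in-process, e.g. before launching the server programmatically.
import os

os.environ["OTEL_SERVICE_NAME"] = "llama-stack"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"  # conventional OTLP/HTTP port
```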