llama-stack-mirror/docs/source/providers/inference/remote_databricks.md
Matthew Farrellee d07ebce4d9
feat: (re-)enable Databricks inference adapter (#3500)
# What does this PR do?

add/enable the Databricks inference adapter

The Databricks inference adapter was broken; closes #3486

- remove deprecated completion / chat_completion endpoints
- enable dynamic model listing without requiring a refresh; listing is not async
- use SecretStr instead of str for the token (see the config sketch after this list)
- backward incompatible change: for consistency with the Databricks docs, env
DATABRICKS_URL -> DATABRICKS_HOST and DATABRICKS_API_TOKEN -> DATABRICKS_TOKEN
- Databricks URLs are custom per user/org, so add special recorder handling
for them
- add integration test --setup databricks
- enable chat completions tests
- enable embeddings tests
- disable n > 1 tests
- disable embeddings base64 tests
- disable embeddings dimensions tests
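
For illustration, here is a minimal sketch of what the adapter config could look like after these changes, assuming pydantic; the class and helper names are hypothetical, not the actual llama-stack implementation, but it shows the SecretStr token and the renamed DATABRICKS_HOST / DATABRICKS_TOKEN environment variables:

```python
# Hypothetical sketch, not the actual adapter code: a config with the renamed
# env vars and a SecretStr token.
import os

from pydantic import BaseModel, Field, SecretStr


def _token_from_env() -> SecretStr | None:
    # Read the token from the renamed env var (was DATABRICKS_API_TOKEN).
    raw = os.getenv("DATABRICKS_TOKEN")
    return SecretStr(raw) if raw is not None else None


class DatabricksConfigSketch(BaseModel):
    # Workspace-specific serving endpoint URL (env var was DATABRICKS_URL).
    url: str | None = Field(default_factory=lambda: os.getenv("DATABRICKS_HOST"))
    # SecretStr keeps the token out of repr()/logs; unwrap with .get_secret_value().
    api_token: SecretStr | None = Field(default_factory=_token_from_env)
```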

note: reasoning models, e.g. gpt-oss, fail because Databricks has a
custom, incompatible response format

## Test Plan

CI and:

```
./scripts/integration-tests.sh --stack-config server:ci-tests --setup databricks --subdirs inference --pattern openai
```

note: Databricks needs to be manually added to the ci-tests distro for
replay testing
2025-09-23 15:37:23 -04:00


# remote::databricks

## Description

Databricks inference provider for running models on Databricks' unified analytics platform.

## Configuration

| Field | Type | Required | Default | Description |
|-------|------|----------|---------|-------------|
| `url` | `<class 'str'>` | No | | The URL for the Databricks model serving endpoint |
| `api_token` | `<class 'pydantic.types.SecretStr'>` | No | | The Databricks API token |

## Sample Configuration

```yaml
url: ${env.DATABRICKS_HOST:=}
api_token: ${env.DATABRICKS_TOKEN:=}
```
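
The `${env.NAME:=default}` placeholders are filled in from the environment when the stack loads its run config; with the empty default used here, an unset variable resolves to an empty string. A simplified sketch of that substitution, not the actual llama-stack resolver:

```python
# Simplified sketch of "${env.NAME:=default}" expansion; the real resolver
# lives inside llama-stack's config loading.
import os
import re

_PLACEHOLDER = re.compile(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*):=([^}]*)\}")


def resolve_env_placeholders(text: str) -> str:
    # Substitute each placeholder with the env var's value, falling back to
    # the default that follows ":=" when the variable is unset.
    return _PLACEHOLDER.sub(lambda m: os.getenv(m.group(1), m.group(2)), text)


print(resolve_env_placeholders("url: ${env.DATABRICKS_HOST:=}"))
```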