Provider Slug: `databricks`

## Quick Start

Get Databricks working in 3 steps:

Tip: You can also set `provider="@databricks"` in `Portkey()` and use just `model="databricks-meta-llama-3-1-70b-instruct"` in the request.

### Add Provider in Model Catalog
- Go to Model Catalog → Add Provider
- Select Databricks
- Choose existing credentials or create new by entering your Databricks personal access token and workspace name
- Name your provider (e.g., `databricks-prod`)
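Once the provider exists, requests route through it by slug. A minimal sketch of a chat completion, assuming you named the provider `databricks-prod` in the step above (swap in your own slug and Portkey API key):

```python
# Chat completion through a Databricks provider in the Model Catalog.
# "databricks-prod" and "PORTKEY_API_KEY" are placeholders for this sketch.

chat_request = {
    "model": "databricks-meta-llama-3-1-70b-instruct",
    "messages": [{"role": "user", "content": "What is Databricks Model Serving?"}],
}

if __name__ == "__main__":
    from portkey_ai import Portkey

    client = Portkey(
        api_key="PORTKEY_API_KEY",    # your Portkey API key
        provider="@databricks-prod",  # provider slug from the Model Catalog
    )
    completion = client.chat.completions.create(**chat_request)
    print(completion.choices[0].message.content)
```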
## Configuration Parameters

| Parameter | Description | Required |
|---|---|---|
| `apiKey` | Databricks personal access token | Yes |
| `databricksWorkspace` | Databricks workspace name (used to construct the URL: `https://<workspace>.cloud.databricks.com`) | Yes |

The workspace can also be passed per request via the `x-portkey-databricks-workspace` header.
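These parameters map onto request headers when you call the gateway directly. A sketch using the raw HTTP endpoint, assuming the standard Portkey gateway headers (`x-portkey-api-key`, `x-portkey-provider`) alongside the workspace header from the table above; the token and workspace values are placeholders:

```python
# Direct gateway call: the Databricks PAT goes in Authorization, the
# workspace in x-portkey-databricks-workspace. Placeholder credentials.
import json
import urllib.request

headers = {
    "Content-Type": "application/json",
    "x-portkey-api-key": "PORTKEY_API_KEY",
    "x-portkey-provider": "databricks",
    "Authorization": "Bearer DATABRICKS_TOKEN",        # Databricks personal access token
    "x-portkey-databricks-workspace": "my-workspace",  # -> https://my-workspace.cloud.databricks.com
}
payload = {
    "model": "databricks-meta-llama-3-1-70b-instruct",
    "messages": [{"role": "user", "content": "Hello"}],
}

if __name__ == "__main__":
    req = urllib.request.Request(
        "https://api.portkey.ai/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers=headers,
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```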
Complete Setup Guide →
See all setup options, code examples, and detailed instructions
## Supported Endpoints

| Endpoint | Support |
|---|---|
| `/chat/completions` | Supported |
| `/completions` | Supported |
| `/embeddings` | Supported |
| `/responses` | Supported |
| `/messages` | Supported |
## Supported Features

| Feature | Support |
|---|---|
| Thinking/Reasoning | Supported via `thinking` and `reasoning_effort` parameters |
| Streaming | Supported |
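Streaming follows the usual OpenAI-compatible pattern of setting `stream: true` and iterating over chunks. A sketch, again assuming a provider slug of `databricks-prod`:

```python
# Streaming chat completion: iterate over delta chunks as they arrive.
# Provider slug and API key are placeholders for this sketch.

stream_request = {
    "model": "databricks-meta-llama-3-1-70b-instruct",
    "messages": [{"role": "user", "content": "Write a haiku about lakehouses."}],
    "stream": True,
}

if __name__ == "__main__":
    from portkey_ai import Portkey

    client = Portkey(api_key="PORTKEY_API_KEY", provider="@databricks-prod")
    for chunk in client.chat.completions.create(**stream_request):
        # Some chunks carry no content delta; guard before printing.
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
```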
## Embeddings

Generate embeddings using Databricks-hosted embedding models:
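A minimal sketch using one of the embedding models from the Supported Models list below; the provider slug is a placeholder:

```python
# Embeddings request in the OpenAI-compatible format.
# "databricks-prod" and "PORTKEY_API_KEY" are placeholders.

embedding_request = {
    "model": "databricks-bge-large-en",
    "input": ["Databricks Model Serving hosts foundation models."],
}

if __name__ == "__main__":
    from portkey_ai import Portkey

    client = Portkey(api_key="PORTKEY_API_KEY", provider="@databricks-prod")
    response = client.embeddings.create(**embedding_request)
    print(len(response.data[0].embedding))  # vector dimensionality
```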
## Responses API

Use the OpenAI Responses API format with Databricks models:
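A sketch in the OpenAI Responses shape, assuming the Portkey Python SDK exposes the `/responses` route as `client.responses.create` (check the SDK reference if your version differs):

```python
# Responses API request: single "input" string instead of a messages array.
# Provider slug, API key, and the client.responses.create surface are
# assumptions of this sketch.

responses_request = {
    "model": "databricks-meta-llama-3-1-70b-instruct",
    "input": "List three kinds of Databricks serving endpoints.",
}

if __name__ == "__main__":
    from portkey_ai import Portkey

    client = Portkey(api_key="PORTKEY_API_KEY", provider="@databricks-prod")
    response = client.responses.create(**responses_request)
    print(response.output_text)
```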
## Messages API

Use the Anthropic Messages API format with Databricks models:
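A sketch in the Anthropic Messages shape, assuming the Portkey Python SDK exposes the `/messages` route as `client.messages.create`, mirroring the Anthropic SDK (again, confirm against the SDK reference):

```python
# Messages API request: Anthropic format requires max_tokens explicitly.
# Provider slug, API key, and the client.messages.create surface are
# assumptions of this sketch.

messages_request = {
    "model": "databricks-meta-llama-3-1-70b-instruct",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Summarize what DBRX is."}],
}

if __name__ == "__main__":
    from portkey_ai import Portkey

    client = Portkey(api_key="PORTKEY_API_KEY", provider="@databricks-prod")
    message = client.messages.create(**messages_request)
    print(message.content[0].text)
```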
## Supported Models

Databricks Model Serving supports a variety of foundation models and custom endpoints:

- Meta Llama Models: `databricks-meta-llama-3-1-70b-instruct`, `databricks-meta-llama-3-1-405b-instruct`
- DBRX: `databricks-dbrx-instruct`
- Embedding Models: `databricks-bge-large-en`, `databricks-gte-large-en`
- Custom Endpoints: Any model deployed as a Databricks serving endpoint
## Next Steps

- Add Metadata: Add metadata to your Databricks requests
- Gateway Configs: Add gateway configs to your Databricks requests
- Tracing: Trace your Databricks requests
- Fallbacks: Set up fallback strategies with Databricks
- SDK Reference: Complete Portkey SDK documentation

