Azure Databricks Rate Limits
Azure Databricks publishes detailed per-API rate limits at the workspace and account level for the Jobs, Workspace, MLflow, SCIM, and other endpoints. Most limits are fixed, but some can be raised through your Databricks account team. Separate resource limits (clusters, jobs, MLflow runs, Vector Search) are also enforced. Throttled requests return HTTP 429.
23 limits · Throttle response: HTTP 429 · Tags: Analytics, Big Data, Lakehouse, Spark, Rate Limiting
Limits
API                                           Scope      Limit   Notes
DBFS API                                      workspace     30
Jobs API - create                             workspace     20
Jobs API - run-now                            workspace     20
Jobs API - runs/get                           workspace    100
Jobs API - runs/submit                        workspace     35
Account SCIM API - GET                        account       20
Account SCIM API - LIST                       account      240
Account SCIM API - PATCH                      account        2
Account SCIM API - POST/PUT/DELETE            account        5
Workspace SCIM API - GET                      workspace    255
Permissions API - GET                         workspace    100
Permissions API - PATCH/PUT                   workspace     30
Secrets API                                   workspace   1100
Token Management API                          workspace     40
MLflow Tracking API (most endpoints)          workspace    120
MLflow search endpoints                       workspace      7
Workspace export API                          workspace     60
Workspace import API                          workspace     30
Genie API (free tier)                         workspace      5   Best-effort across all Genie spaces; raise via Databricks account team.
Genie UI questions                            workspace     20
MCP servers (managed UC/VectorSearch/Genie)   workspace     50
MLflow Trace creation                         workspace     50
Model Serving throughput target               endpoint     200   Soft target; not guaranteed.
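Because most of these limits are workspace-scoped, a client-side limiter helps keep your aggregate traffic under a quota before the service starts returning 429s. A minimal token-bucket sketch in Python (assuming the limits are per-second rates; the class and names are illustrative, not a Databricks API):

```python
import threading
import time

class TokenBucket:
    """Client-side limiter: allow at most `rate` requests per second,
    with bursts up to `capacity` tokens."""

    def __init__(self, rate, capacity=None):
        self.rate = float(rate)
        self.capacity = float(capacity if capacity is not None else rate)
        self.tokens = self.capacity
        self.updated = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self):
        """Block until a token is available, then consume it."""
        while True:
            with self.lock:
                now = time.monotonic()
                # Refill tokens based on elapsed time, capped at capacity.
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.updated) * self.rate)
                self.updated = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
                wait = (1 - self.tokens) / self.rate
            time.sleep(wait)

# Example: stay under a 20 req/s budget (e.g. Jobs run-now). The limit is
# workspace-wide, so divide the budget across concurrent clients.
limiter = TokenBucket(rate=20)
```

Call `limiter.acquire()` before each request; the limiter blocks just long enough to smooth bursts down to the configured rate.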
Policies
Backoff
Honor Retry-After on 429 responses; use exponential backoff with jitter.
Limit raise via account team
Limits that Databricks documents as not fixed can often be raised by contacting your Azure Databricks account team.
Workspace vs account scope
Some limits are workspace-scoped, others account-scoped; check the per-API table.
Resource limits separate from rate limits
Azure Databricks also enforces resource caps (clusters, MLflow runs, Vector Search indexes, Unity Catalog objects) that are not API rate limits but block create operations once exceeded.
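The Backoff policy above can be sketched as a small retry wrapper. Here `send` is a hypothetical callable returning `(status, headers, body)`; wire it to your HTTP client of choice (the function name and signature are illustrative, not part of any Databricks SDK):

```python
import random
import time

def call_with_backoff(send, max_retries=5, base=1.0, cap=30.0):
    """Retry `send` on HTTP 429. Honor the Retry-After header when the
    server provides one; otherwise use exponential backoff with full jitter.
    `send` must return a (status_code, headers, body) tuple."""
    for attempt in range(max_retries + 1):
        status, headers, body = send()
        if status != 429:
            return status, body
        if attempt == max_retries:
            break
        retry_after = headers.get("Retry-After")
        if retry_after is not None:
            # Server told us how long to wait; respect it exactly.
            delay = float(retry_after)
        else:
            # Full jitter: uniform over [0, min(cap, base * 2^attempt)].
            delay = random.uniform(0, min(cap, base * 2 ** attempt))
        time.sleep(delay)
    raise RuntimeError("rate limited: retries exhausted")
```

Full jitter keeps many throttled clients from retrying in lockstep, which would otherwise reproduce the original traffic spike.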