> **Beta Product:** SQL Proxy is currently in beta. API endpoints may change.
The jobs compute configuration controls how SQL Proxy routes queries to Databricks Jobs compute when a query includes the `@datafold:jobs_compute` directive.
## Get Configuration
`GET /admin/config/jobs-compute`

```bash
curl -X GET \
  -H "Authorization: Bearer <admin-token>" \
  https://sqlproxy.your-company.datafold.com/admin/config/jobs-compute
```
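A successful request returns the current configuration as a JSON object with the fields documented under Request Body below. The values here are illustrative (taken from the PUT example in the next section), not guaranteed defaults:

```json
{
  "enabled": true,
  "poll_interval_ms": 5000,
  "max_wait_ms": 3600000,
  "executor_script_path": "/dbfs/datafold/executor.py",
  "default_spark_version": "14.3.x-scala2.12",
  "default_node_type": "i3.xlarge",
  "default_num_workers": 4
}
```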
## Update Configuration (Full Replace)

`PUT /admin/config/jobs-compute`

Because PUT replaces the entire configuration, the request body must include every field listed under Request Body.
```bash
curl -X PUT \
  -H "Authorization: Bearer <admin-token>" \
  -H "Content-Type: application/json" \
  -d '{
    "enabled": true,
    "poll_interval_ms": 5000,
    "max_wait_ms": 3600000,
    "executor_script_path": "/dbfs/datafold/executor.py",
    "default_spark_version": "14.3.x-scala2.12",
    "default_node_type": "i3.xlarge",
    "default_num_workers": 4
  }' \
  https://sqlproxy.your-company.datafold.com/admin/config/jobs-compute
```
### Request Body
| Field | Type | Required | Description |
|---|---|---|---|
| `enabled` | boolean | Yes | Enable or disable jobs compute routing |
| `poll_interval_ms` | integer | Yes | Polling interval for job status, in milliseconds |
| `max_wait_ms` | integer | Yes | Maximum wait time for job completion, in milliseconds |
| `executor_script_path` | string | Yes | DBFS path to the executor script |
| `default_spark_version` | string | Yes | Default Databricks runtime version |
| `default_node_type` | string | Yes | Default EC2 instance type for workers |
| `default_num_workers` | integer | Yes | Default number of workers |
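To confirm a full-replace update took effect, you can read the configuration back. A minimal sketch, assuming `jq` is installed and the GET endpoint returns the JSON object shown earlier:

```bash
# Fetch the current configuration and extract one field for a quick check.
curl -s -H "Authorization: Bearer <admin-token>" \
  https://sqlproxy.your-company.datafold.com/admin/config/jobs-compute \
  | jq '.default_num_workers'
```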
## Update Configuration (Partial)

`PATCH /admin/config/jobs-compute`

All fields are optional; only the fields provided in the request body are updated.
```bash
curl -X PATCH \
  -H "Authorization: Bearer <admin-token>" \
  -H "Content-Type: application/json" \
  -d '{"default_num_workers": 8}' \
  https://sqlproxy.your-company.datafold.com/admin/config/jobs-compute
```
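Reading the configuration back after this request should show only `default_num_workers` changed. Continuing the illustrative values from the GET example above:

```json
{
  "enabled": true,
  "poll_interval_ms": 5000,
  "max_wait_ms": 3600000,
  "executor_script_path": "/dbfs/datafold/executor.py",
  "default_spark_version": "14.3.x-scala2.12",
  "default_node_type": "i3.xlarge",
  "default_num_workers": 8
}
```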