apache-spark | databricks | aws-databricks | databricks-workflows

Databricks - Tag in job Cluster


I have a question about how to tag the cluster of a job cluster in Databricks via the API.

I know I can already tag a cluster and a job, but I want to tag the cluster inside a job cluster. Is this possible?

I tried using the "jobs/update" endpoint to insert the tag into the job cluster, but even with these fields set I still get the same error (screenshot below).

Example request:

curl --location --request POST 'https://databricks.com/api/2.0/jobs/update' \
--header 'Authorization: Bearer token' \
--header 'Content-Type: application/json' \
--data-raw '{
    "job_id": 123456789,
    "new_settings": {
        "job_clusters": [
            {
                "job_cluster_key": "test",
                "new_cluster": {
                    "custom_tags": {"test": "123"}
                }
            }
        ]
    }
}'

[screenshot of the error]

I want to tag the resource (the cluster) within the job cluster. Is this possible via the API? Has anyone done this?


Solution

  • If you look into the Jobs API documentation, you'll see that you need to provide the full configuration for the block you're updating, not only the changed nested fields. In other words, this API only lets you replace top-level fields of the job settings, and each such field must be replaced in its entirety; you can't patch individual nested fields. To add a tag, you have to resend the whole job_clusters array, including the complete new_cluster definition, as sketched below.

    P.S. I'll ask the docs team to add a clarification there.
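
    For illustration, here is a minimal sketch of a corrected flow: first retrieve the current settings with jobs/get, then resend the entire job_clusters block with the tag added. The spark_version, node_type_id, and num_workers values below are placeholder assumptions; substitute the values returned by jobs/get for your job.

    # Retrieve the current job settings so the full job_clusters block can be copied
    curl --location --request GET 'https://databricks.com/api/2.0/jobs/get?job_id=123456789' \
    --header 'Authorization: Bearer token'

    # Resend the whole job_clusters block, copied from jobs/get, with custom_tags added.
    # spark_version, node_type_id, and num_workers here are placeholders, not your real values.
    curl --location --request POST 'https://databricks.com/api/2.0/jobs/update' \
    --header 'Authorization: Bearer token' \
    --header 'Content-Type: application/json' \
    --data-raw '{
        "job_id": 123456789,
        "new_settings": {
            "job_clusters": [
                {
                    "job_cluster_key": "test",
                    "new_cluster": {
                        "spark_version": "11.3.x-scala2.12",
                        "node_type_id": "i3.xlarge",
                        "num_workers": 2,
                        "custom_tags": {"test": "123"}
                    }
                }
            ]
        }
    }'

    Because new_settings replaces the job_clusters field wholesale, omitting any part of the existing new_cluster definition (as in the request from the question) would drop it rather than leave it unchanged.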