Tags: azure, terraform, databricks, terraform-provider-databricks

Adding multiple spark_conf lines to Databricks using Terraform, in Azure


I'm trying to use a "spark_conf {}" block in my databricks_cluster resource in Terraform. The block accepts a key and a value, but I would like to add several of these pairs. I was able to do it hardcoded, like this:

spark_conf = {
  (var.spark_configs[0].key) : var.spark_configs[0].value,
  (var.spark_configs[1].key) : var.spark_configs[1].value,
  (var.spark_configs[2].key) : var.spark_configs[2].value,
  (var.spark_configs[3].key) : var.spark_configs[3].value,
}

Here the variable "spark_configs" is a list of objects.
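
For reference, the variable declaration behind this presumably looks roughly like the following (a sketch inferred from the indexing above, not the exact definition):

variable "spark_configs" {
  type = list(object({
    key   = string
    value = string
  }))
}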

This works, but it is not really dynamic, so I'm hoping to find a solution that simply loops over the list of objects.

I tried a dynamic block, but that creates a separate "spark_conf" block for every key-value pair in the list. That's not the intention: they should all end up in one block (I think; correct me if I'm wrong).

Any ideas on how to attack this?

Thanks!


Solution

  • Yes, you can use a for expression with a map constructor for this. Note that there is a misconception in the question: spark_conf is not a block, it is an argument that accepts a map, so a dynamic block could not be used in this situation regardless. The expression would look like:

    spark_conf = { for spark_config in var.spark_configs : spark_config.key => spark_config.value }
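
    In context, inside the cluster resource this might look like the following (the surrounding arguments are illustrative placeholders, not taken from the question):

    resource "databricks_cluster" "this" {
      cluster_name  = "example"
      spark_version = "13.3.x-scala2.12"
      node_type_id  = "Standard_DS3_v2"
      num_workers   = 1

      # one map entry per element of the list(object) variable
      spark_conf = { for spark_config in var.spark_configs : spark_config.key => spark_config.value }
    }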
    

    Note that if you restructured your variable spark_configs to be a map instead of a list(object), which is the more natural shape here, e.g.:

    { "key1" => "value1", "key2" => "value2" }
    

    then this would simplify the argument assignment to:

    spark_conf = var.spark_configs
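
    The corresponding variable declaration would then be along these lines (a sketch; the keys and values are placeholders, not settings from the question):

    variable "spark_configs" {
      type = map(string)
      default = {
        "spark.sql.shuffle.partitions" = "200"
        "spark.speculation"            = "true"
      }
    }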