Tags: databricks, azure-databricks, cross-join, databricks-sql, cross-apply

Is there a way to use CROSS APPLY from SQL to Spark SQL?


I have a complex stored procedure that uses multiple views/functions internally, and inside those there are multiple CROSS APPLYs. I am not sure if there is an "easy solution" to replicate it in spark.sql.

df = spark.sql("""
SELECT *
FROM table
CROSS APPLY (
    some business rule
)
""")

I checked the JOIN documentation but didn't find what I was looking for.


Solution

  • First of all, you must enable this option in your notebook:

    spark.sql("SET spark.sql.crossJoin.enabled=true")
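Note that this flag mainly matters on older Spark versions (it defaults to true from Spark 3.0 onward). `CROSS APPLY` itself is not valid Spark SQL syntax; the closest equivalent is a lateral join, supported since Spark 3.2 with the `LATERAL` keyword (or `LATERAL VIEW explode(...)` for array-expanding cases). A minimal sketch, assuming hypothetical tables `orders` and `order_items` to stand in for the "business rule" subquery:

```sql
-- T-SQL: FROM orders o CROSS APPLY (SELECT ... WHERE i.order_id = o.id) t
-- Spark SQL (3.2+): a correlated subquery introduced with LATERAL
SELECT o.id, t.total
FROM orders AS o,
LATERAL (
    SELECT SUM(amount) AS total
    FROM order_items AS i
    WHERE i.order_id = o.id   -- correlated reference to the outer row
) AS t;
```

For an `OUTER APPLY`, the corresponding form is a `LEFT OUTER JOIN LATERAL`, which keeps outer rows even when the subquery returns no rows.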