python  pyspark  spark-notebook  microsoft-fabric  data-lakehouse

Get the list of tables from a Fabric workspace using the ABFSS path


I am trying to retrieve the list of tables from a lakehouse located in a separate workspace by using the ABFSS (secure Azure Blob File System) path. The code I have tried so far has not produced the desired outcome, and I am now wondering whether this is feasible at all. A code example is shown below.

olspath = "abfss://path................"

# Attempts so far:
#df = spark.read.format('delta').load(olspath)   # loads the folder as a single Delta table
#df = spark.read.load(olspath)
df = spark.read.schema(olspath)                  # fails: .schema() expects a schema, not a path
#df.write.mode("overwrite").format('delta').save("Tables/"+"Account")
df.show()

Solution

  • Using dbutils you can list the paths under your ABFSS path; by checking whether each path is a Delta table, you get the tables (a Fabric-native variant using mssparkutils is sketched at the end of this answer).

    dbutils.fs.ls("<Your_abfss_path>")
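
    For reference, dbutils.fs.ls returns a list of FileInfo entries; each entry's
    first field (also available as .path) is the folder's full path, which is what
    the Delta check below relies on. A quick way to inspect the listing, assuming
    the usual FileInfo fields:

    # Print the path, name, and size of each entry under the ABFSS path
    for f in dbutils.fs.ls("<Your_abfss_path>"):
        print(f.path, f.name, f.size)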
    

    code:

    from delta.tables import DeltaTable

    # Keep only the entries under the path that are Delta tables
    files = dbutils.fs.ls("<Your_abfss_path>")
    print("Tables in the given path:")
    for f in files:
        if DeltaTable.isDeltaTable(spark, f.path):
            print(f.path)                                    # the table's location
            DeltaTable.forPath(spark, f.path).toDF().show()  # preview its contents
    


    In OneLake (screenshot of the corresponding tables)
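
    If dbutils is not available in your Fabric notebook, mssparkutils ships with
    the Fabric and Synapse runtimes and exposes an equivalent fs.ls. A minimal
    sketch of the same check using it (your ABFSS path still goes in the
    placeholder):

    from delta.tables import DeltaTable
    from notebookutils import mssparkutils  # built into Fabric/Synapse runtimes

    # List the lakehouse folder in the other workspace and print each Delta table
    for f in mssparkutils.fs.ls("<Your_abfss_path>"):
        if DeltaTable.isDeltaTable(spark, f.path):
            print(f.path)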