python, databricks, azure-databricks, aws-databricks, databricks-community-edition

Unable to execute Databricks REST API for data copy using Python


When I execute the code below to copy data from Databricks to my local machine, it fails with an error. Can anyone please help me resolve it?

import os

from databricks_cli.sdk.api_client import ApiClient
from databricks_cli.dbfs.api import DbfsApi
from databricks_cli.dbfs.dbfs_path import DbfsPath

api_client = ApiClient(host  = r"https://azuredatabricks.net/?o=XXXX",token = "121314141")


dbfs_source_file_path      = 'dbfs:/FileStore/datafilename.csv'
local_file_download_path   = 'C:/world.txt'

dbfs_api  = DbfsApi(api_client)
dbfs_path = DbfsPath(dbfs_source_file_path)

# Download the workspace file locally.
dbfs_api.get_file(dbfs_source_file_path,local_file_download_path,overwrite = True)

Error: AttributeError: 'str' object has no attribute 'absolute_path'

The code is an exact replica of the documentation at https://docs.databricks.com/dev-tools/python-api.html#download-a-file-from-a-dbfs-path


Solution

  • I got the same error when I ran your code. Following the error trace shows that dbfs_api.get_file is being passed the string path instead of the DbfsPath object, hence the error 'str' object has no attribute 'absolute_path'.



    import os
    
    from databricks_cli.sdk.api_client import ApiClient
    from databricks_cli.dbfs.api import DbfsApi
    from databricks_cli.dbfs.dbfs_path import DbfsPath
    
    api_client = ApiClient(host  = r"https://azuredatabricks.net/?o=XXXX",token = "121314141")
    
    
    dbfs_source_file_path      = 'dbfs:/FileStore/datafilename.csv'
    local_file_download_path   = 'C:/world.txt'
    
    dbfs_api  = DbfsApi(api_client)
    dbfs_path = DbfsPath(dbfs_source_file_path)
    
    # Download the workspace file locally.
    dbfs_api.get_file(dbfs_path, local_file_download_path, overwrite=True)  # pass the DbfsPath object, not the string
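
    You can see the root cause without Databricks at all: per the traceback, get_file reads the absolute_path attribute of its source argument, and a plain string has no such attribute. A minimal sketch (no databricks_cli needed):

    ```python
    # The traceback complains that 'str' has no 'absolute_path' attribute;
    # that attribute exists on DbfsPath objects, not on raw path strings.
    path_str = 'dbfs:/FileStore/datafilename.csv'
    print(hasattr(path_str, 'absolute_path'))  # False, so get_file raises AttributeError
    ```

    Wrapping the string in DbfsPath, as in the corrected call above, gives get_file an object with that attribute.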