python, databricks, azure-databricks, databricks-repos

How do I copy a file from DBFS to REPOS on Databricks?


I'm currently working on moving a Python .whl file that I generated in DBFS to my repo at /Workspace/Repos/My_Repo/My_DBFS_File, so that I can commit the file to Azure DevOps.

Because Databricks Repos is a read-only location, it does not permit me to programmatically copy the file into the repo. The UI provides options to create or import files from various sources, but DBFS is not among them.

Is there a workaround to move DBFS files into Repos and then commit them to Azure DevOps?


Solution

  • The documentation says:

    Databricks Runtime 11.2 or above.

    In a Databricks Repo, you can programmatically create directories and create and append to files. This is useful for creating or modifying an environment specification file, writing output from notebooks, or writing output from execution of libraries, such as Tensorboard.

    Using a cluster with Databricks Runtime 11.2 or above solved my issue.