dataframe · pyspark · databricks · etl · databricks-community-edition

Do databricks secrets work with community edition?


I'm trying to create an ETL project for resume fluff. The idea is that I would have real estate data web-scraped with Python (done), then cleaned with PySpark in Databricks (done), then have the dataframe pushed to AWS DynamoDB and maybe output to a dashboard in Power BI.

However, for obvious reasons I don't want my AWS access tokens hard-coded in the script. I'm using Databricks Community Edition, and despite searching it seems debatable whether secrets are available on this platform. Can anyone point me in the right direction?

If not, I could install PySpark locally and just use environment variables, but a Databricks notebook has that snazzy look that works well for a resume-fluff project.


Solution

  • No, secrets aren't enabled on Community Edition. If you try to execute any secrets-related command, such as:

    dbutils.secrets.listScopes()
    

    it will give the following error:

    Secrets API is not enabled for this workspace.
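
    As the question itself suggests, environment variables are a reasonable fallback for keeping tokens out of the script. Here's a minimal sketch of that approach; the helper function name is illustrative, but `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are the standard variable names that AWS client libraries like boto3 pick up:

    ```python
    import os

    def aws_credentials_from_env():
        """Read AWS credentials from environment variables instead of
        hard-coding them in the notebook/script. Raises KeyError if
        either variable is missing, so a misconfigured environment
        fails loudly rather than silently."""
        return {
            "aws_access_key_id": os.environ["AWS_ACCESS_KEY_ID"],
            "aws_secret_access_key": os.environ["AWS_SECRET_ACCESS_KEY"],
        }

    # The resulting dict could then be unpacked into a client library, e.g.:
    # dynamodb = boto3.resource("dynamodb", region_name="us-east-1",
    #                           **aws_credentials_from_env())
    ```

    Note that boto3 will also read these variables automatically if you simply export them before running, so explicit passing is only needed when you want the lookup to be visible in the code.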