Tags: r, databricks, sparklyr

sparklyr spark_read_table from a specific database


My database admin has moved my tables from the default database to another database (named marketing), and my query below no longer works.

This is what was done:

    CREATE OR REPLACE VIEW marketing.scv_snap AS SELECT * FROM delta.`dbfs:/mnt/dataLake/xxx/xxx`;
    SELECT * FROM marketing.scv_snap LIMIT 10;

I am unable to include the 'marketing' database in the spark_read_table call below. I'd appreciate your help.

    spark_read_table(sc, 'scv_tbl', memory = FALSE)

Solution

  • You need to use the tbl_change_db function to change the current database:

        # Switch the connection's current database to "marketing",
        # then read the table by its unqualified name.
        tbl_change_db(sc, "marketing")
        data <- spark_read_table(sc, "scv_tbl")
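
  • Alternatively, you can leave the current database untouched and qualify the table through dbplyr's in_schema helper. A minimal sketch, assuming the same connection sc and the table name scv_tbl from the question:

        library(dplyr)

        # Build a lazy reference to marketing.scv_tbl without
        # changing the connection's default database.
        data <- tbl(sc, dbplyr::in_schema("marketing", "scv_tbl"))

    Since sparklyr connections also implement the DBI interface, running DBI::dbGetQuery(sc, "USE marketing") should have the same effect as tbl_change_db if you prefer plain SQL.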