I have a piece of code like this:
library(DBI)
library(RMySQL)

mydb <- dbConnect(MySQL(), user = "****", password = "****", dbname = "c_global", host = "")
df_ipt_rate_history <- fetch(dbSendQuery(mydb, "select * from ipt_rate_history"), n=-1)
df_map_occupation <- fetch(dbSendQuery(mydb, "select * from map_occupation"), n=-1)
df_map_titles <- fetch(dbSendQuery(mydb, "select * from map_titles"), n=-1)
df_pcode_full_areas <- fetch(dbSendQuery(mydb, "select * from pcode_full_areas"), n=-1)
df_pcode_rics_regions <- fetch(dbSendQuery(mydb, "select * from pcode_rics_regions"), n=-1)
df_pcode_sector_areas <- fetch(dbSendQuery(mydb, "select * from pcode_sector_areas"), n=-1)
...and so on (multiple tables)
dbDisconnect(mydb)
It works fine, but is there a way to set up some kind of loop that reads every table in a database into its own data frame in a single command?
Thank you
You can use dbListTables to list all the tables in your database, and then read each one in full with the utility function dbReadTable:
all_data <- lapply(dbListTables(mydb), dbReadTable, conn = mydb)
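If you also want to access each data frame by its table name, you can name the resulting list with the same dbListTables call. A small extension of the one-liner above (tbl_names is just a local variable introduced here for illustration):

```r
library(DBI)
library(RMySQL)

# List the tables once, then read each one into a data frame and
# name the list elements after the tables, so you can write e.g.
# all_data$ipt_rate_history instead of all_data[[1]].
tbl_names <- dbListTables(mydb)
all_data  <- setNames(lapply(tbl_names, dbReadTable, conn = mydb), tbl_names)

dbDisconnect(mydb)
```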
Incidentally, unless it is really necessary, it may not be a good idea to load all of the data into memory when it can be read from the database on demand.
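As one illustration of reading on demand, a sketch assuming the dplyr/dbplyr packages are installed (the column name rate here is hypothetical, just for the example):

```r
library(dplyr)

# tbl() builds a *lazy* reference to the table: nothing is pulled from
# MySQL yet, and the filter is translated to SQL and run server-side.
# Only collect() actually brings the (already filtered) rows into memory.
rates <- tbl(mydb, "ipt_rate_history") %>%
  filter(rate > 0) %>%   # hypothetical column, for illustration only
  collect()
```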