I'm helping to close down an organization that is going out of business. One of the tasks is to make a backup of all our BigQuery datasets and keep it for at least 5 years for legal purposes. Since this is quite a long time, I would like to minimize the cost as much as possible.
Since we'll be closing the company's Google Workspace, we created a new backup account to transfer all the data to. We tried the following:
I'd be glad to hear new ideas. Thanks!
As suggested by Guillaume, you can export all the tables to Cloud Storage. From there, do what you want: set the storage class to Archive to minimize cost, download the files, gzip them, and upload them wherever you want (Google Drive, a Cloud Storage bucket in the Archive class), etc. In any case, Cloud Storage is the required point of passage for the next steps!
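If a scripted version helps, below is a minimal sketch of that flow using the google-cloud-bigquery and google-cloud-storage Python clients. The project ID, bucket name, and location are placeholders, and error handling is omitted; adjust the export format to your needs.

```python
from google.cloud import bigquery, storage

PROJECT = "my-project"       # hypothetical project ID
BUCKET = "my-backup-bucket"  # hypothetical bucket name

# 1. Create a bucket in the Archive storage class (cheapest for cold data).
#    The bucket location must be compatible with the datasets' location.
storage_client = storage.Client(project=PROJECT)
bucket = storage_client.bucket(BUCKET)
bucket.storage_class = "ARCHIVE"
storage_client.create_bucket(bucket, location="EU")

# 2. Export every table of every dataset as gzipped newline-delimited JSON.
#    (GZIP is not valid for Avro exports; Avro uses DEFLATE/SNAPPY instead.)
bq_client = bigquery.Client(project=PROJECT)
job_config = bigquery.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON,
    compression=bigquery.Compression.GZIP,
)

for dataset in bq_client.list_datasets():
    for table in bq_client.list_tables(dataset.dataset_id):
        # The wildcard lets BigQuery shard tables larger than 1 GB
        # across multiple output files.
        uri = f"gs://{BUCKET}/{table.dataset_id}/{table.table_id}/*.json.gz"
        bq_client.extract_table(
            table.reference, uri, job_config=job_config
        ).result()  # wait for the extract job to finish
        print(f"Exported {table.dataset_id}.{table.table_id} -> {uri}")
```

Compressing at export time avoids the extra download/gzip/re-upload round trip. Note that JSON exports do not keep the table schema; Avro does, at the cost of different compression options.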