Tags: bash, curl, wget, google-docs-api

Download Large Google-Drive Files


I was just given a large list of 500 MB files in Google Drive. How would I go about queuing them for download on my local Linux machine?

I can't zip them all into one large download.

I can't set them all to download at once

I can't be here all day to download them in small batches.

I see from questions like "wget/curl large file from google drive" that the API is deprecated on Google Drive and that we can't simply wget the files.

So what I am looking for is a way to sequentially download large files from Google Drive, without having to manually click through the web browser to do so.


Solution

    1. Go to OAuth 2.0 Playground
    2. In Select & authorize APIs (Step 1), select “Drive API v3” and then select "https://www.googleapis.com/auth/drive.readonly" as the scope.
    3. Authorize the API with your Google ID and exchange the authorization code for tokens (Step 2). You will get an access token and a refresh token.
    4. Open terminal and run

    curl -H "Authorization: Bearer XXXXX" "https://www.googleapis.com/drive/v3/files/YYYYY?alt=media" -o ZZZZZ

    where

    XXXXX ===> Access Token

    YYYYY ===> File ID

    ZZZZZ ===> File name the download will be saved as
    

    5. Tested on Linux (should also work on Mac). Windows users can try PowerShell.
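To answer the "queuing" part of the question, the per-file curl call from step 4 can be wrapped in a loop that works through a whole list of files sequentially. A minimal sketch, with assumptions: `files.txt` and `download_all` are hypothetical names, and `ACCESS_TOKEN` is assumed to hold the token from the OAuth Playground.

```shell
#!/usr/bin/env bash
# Sequentially download Drive files listed in a text file of
# "FILE_ID OUTPUT_NAME" pairs, one per line.
# files.txt and download_all are hypothetical names; ACCESS_TOKEN
# is the access token obtained from the OAuth 2.0 Playground.
download_all() {
  list="$1"
  while read -r id name; do
    [ -z "$id" ] && continue          # skip blank lines
    echo "Downloading $name ..."
    curl -fsS -H "Authorization: Bearer $ACCESS_TOKEN" \
      "https://www.googleapis.com/drive/v3/files/$id?alt=media" \
      -o "$name"
  done < "$list"
}

# Usage: ACCESS_TOKEN=XXXXX download_all files.txt
```

Because the loop runs the downloads one after another, you can start it and walk away instead of clicking through the browser or babysitting small batches.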

    Reference: Drive REST API Documentation.
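One more wrinkle: Google OAuth access tokens typically expire after about an hour, which matters for a long queue of large files. The playground's built-in client can only be refreshed from its web UI, but if you create your own OAuth client you can exchange the refresh token for a new access token from the command line. A hedged sketch; `CLIENT_ID`, `CLIENT_SECRET`, and `REFRESH_TOKEN` are placeholders for your own client's credentials:

```shell
#!/usr/bin/env bash
# Exchange a refresh token for a fresh access token.
# Assumes you created your own OAuth client in the Google Cloud
# console; CLIENT_ID / CLIENT_SECRET / REFRESH_TOKEN are placeholders.
refresh_access_token() {
  curl -s https://oauth2.googleapis.com/token \
    -d grant_type=refresh_token \
    -d client_id="$CLIENT_ID" \
    -d client_secret="$CLIENT_SECRET" \
    -d refresh_token="$REFRESH_TOKEN"
}

# The JSON response contains a fresh "access_token" field to use
# in the Authorization header of subsequent downloads.
```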