Tags: bash, curl, jq, xargs

Curl list of links and save files according to URL hierarchy


The first API call returns a JSON response that includes a bunch of links. I want to curl each link and save the downloaded file to a location based on the last 3 path segments of its URL.

For example, if the URL is https://data.ninjakiwi.com/btd6/races/The_Olympics_lzlqex8k/metadata, then I want to download the file to races/The_Olympics_lzlqex8k/metadata.json. Apparently cut cannot handle indexing from the end, so I've hardcoded the number of slashes to skip (including the 2 in https://). I came up with the command below. Is there a more elegant way to write this? Experimenting with xargs has shown me this is probably possible without a loop and with less subshell usage.

for url in $(curl https://data.ninjakiwi.com/btd6/races | jq -r '.body[] | .leaderboard, .metadata'); do
    echo "$url"
    curl --create-dirs -o "$(echo "$url" | cut -d/ -f5-).json" "$url"
done
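
For what it's worth, awk can count fields from the end via NF, which avoids hardcoding the slash count; a sketch of what I mean, using the example URL above:

    echo "https://data.ninjakiwi.com/btd6/races/The_Olympics_lzlqex8k/metadata" |
    awk -F/ '{ print $(NF-2) "/" $(NF-1) "/" $NF }'
    # prints: races/The_Olympics_lzlqex8k/metadata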

Solution

  • You don't need a loop. This should work:

    curl https://data.ninjakiwi.com/btd6/races |
    jq -r '.body[] | .leaderboard, .metadata | "url = "+., "output = "+((./"/")[-3:] | join("/"))+".json"' |
    curl -Z -K - --create-dirs
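
    Here -Z (--parallel) downloads the files concurrently (curl 7.66.0 or newer), and -K - makes curl read its arguments as a config file from stdin. In the jq filter, (./"/") splits each URL string on "/", [-3:] takes the last three segments, and join("/") reassembles them into the output path, so nothing is hardcoded. For the example URL above, the filter emits a pair of config lines like:

        url = https://data.ninjakiwi.com/btd6/races/The_Olympics_lzlqex8k/metadata
        output = races/The_Olympics_lzlqex8k/metadata.json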