Tags: python, wget, urlopen

Struggling to grab data from a website using Python


I'm trying to grab snowfall data from the National Weather Service at this site:

https://www.nohrsc.noaa.gov/snowfall/

The data can be downloaded from the webpage by clicking the file type in the drop-down, but I can't figure out how to automate this in Python. They have an FTP archive, but it requires a login and I can't access it for some reason.

However, since the files can be downloaded via a click on the webpage interface, I imagine there must be a way to grab them using wget or urlopen. But I can't figure out what the exact URL would be in order to use those functions. Does anyone have any ideas on how to download this data straight from the website listed above?

Thanks!


Solution

  • You can inspect the download links with Chrome DevTools.

    Press F12, open the Network tab, then click on the file type:

    [screenshot: the DevTools Network panel showing the request triggered by the click]

    Here is an example URL (a sketch at the end of this answer shows how to build such URLs for other dates):
    https://www.nohrsc.noaa.gov/snowfall/data/202112/sfav2_CONUS_6h_2021122618_grid184.nc

    You can download it in Python with the Requests library:

    import requests

    r = requests.get('https://www.nohrsc.noaa.gov/snowfall/data/202112/sfav2_CONUS_6h_2021122618_grid184.nc')

    data = r.content  # raw bytes of the downloaded file
    
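    Since r.content is only the raw bytes in memory, a minimal follow-up sketch (assuming you want the file on disk under the arbitrary name data.nc) is:

    r.raise_for_status()              # stop early if the server returned an error status
    with open('data.nc', 'wb') as f:  # 'data.nc' is an arbitrary local filename
        f.write(r.content)            # write the downloaded bytes to disk
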

    Or you can save it straight to a file with urlretrieve:

    from urllib.request import urlretrieve

    url = 'https://www.nohrsc.noaa.gov/snowfall/data/202112/sfav2_CONUS_6h_2021122618_grid184.nc'
    dst = 'data.nc'
    urlretrieve(url, dst)  # downloads the file and writes it to data.nc
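
    The filename in the URL above appears to follow a pattern: a YYYYMM directory plus sfav2_CONUS_6h_YYYYMMDDHH_grid184.nc. Assuming that pattern holds for other dates (an assumption based only on this one example link), a small sketch that builds and fetches the URL for a given timestamp could look like this; snowfall_url is a hypothetical helper, not part of any NOAA API:

    from datetime import datetime
    import requests

    def snowfall_url(ts):
        # Assumed pattern, inferred from the example link above:
        #   .../data/<YYYYMM>/sfav2_CONUS_6h_<YYYYMMDDHH>_grid184.nc
        base = 'https://www.nohrsc.noaa.gov/snowfall/data'
        return f'{base}/{ts:%Y%m}/sfav2_CONUS_6h_{ts:%Y%m%d%H}_grid184.nc'

    ts = datetime(2021, 12, 26, 18)   # 18Z on 2021-12-26, the time from the example
    r = requests.get(snowfall_url(ts))
    r.raise_for_status()              # fail loudly if the file is missing
    with open(f'sfav2_CONUS_6h_{ts:%Y%m%d%H}.nc', 'wb') as f:
        f.write(r.content)            # save the NetCDF file locally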