matplotlib hdf5 astropy polar-coordinates fits

Plotting mean and standard deviation values on a skyplot using astropy from an HDF5 file


I am trying to create a skyplot (using astropy) containing the mean and standard deviation values from an HDF5 file. The link to the data is https://wwwmpa.mpa-garching.mpg.de/~ensslin/research/data/faraday2020.html (Faraday Sky 2020). I have written the following code so far: the data is read from the HDF5 file into ggl and ggb, after which the values are converted to galactic coordinates (l and b) in gl and gb. I need these values to be plotted on the skyplot.

from astropy import units as u
from astropy.coordinates import SkyCoord
import matplotlib.pyplot as plt
import numpy as np
import h5py

dat = []
ggl = []
ggb = []

with h5py.File('faraday2020.hdf5','r') as hdf:
    print(list(hdf.keys()))
    faraday_sky_mean = hdf['faraday_sky_mean'][:]
    faraday_sky_std = hdf['faraday_sky_std'][:]
    
print(faraday_sky_mean.shape, faraday_sky_mean.dtype) 
print(f'Max Mean={max(faraday_sky_mean)}, Min Mean={min(faraday_sky_mean)}') 
print(faraday_sky_std.shape, faraday_sky_std.dtype) 
print(f'Max StdDev={max(faraday_sky_std)}, Min StdDev={min(faraday_sky_std)}') 

ggl = faraday_sky_mean.tolist()
print(len(ggl),type(ggl[0]))
ggb = faraday_sky_std.tolist()
print(len(ggb),type(ggb[0]))

gl = ggl * u.degree
gb = ggb * u.degree


c = SkyCoord(l=gl, b=gb, frame='galactic', unit=(u.deg, u.deg))

l_rad = c.l.wrap_at(180 * u.deg).radian
b_rad = c.b.radian

###
plt.figure(figsize=(8,4.2))
plt.subplot(111, projection="aitoff")

plt.title("Mean and standard dev", y=1.08, fontsize=20)
plt.grid(True)

P1 = plt.plot(l_rad, b_rad, c="blue", s=220, marker="h", alpha=0.7)

plt.subplots_adjust(top=0.95, bottom=0.0)
plt.xlabel('l (deg)', fontsize=20)
plt.ylabel('b (deg)', fontsize=20)

plt.show()

However, I am getting the following error:

'got {}'.format(angles.to(u.degree)))

ValueError: Latitude angle(s) must be within -90 deg <= angle <= 90 deg, got [1.12490771 0.95323024 0.99124631 ... 4.23648627 4.28821608 5.14498169] deg

Please help me fix this.


Solution

  • This is an extension to my previous answer. The original post wanted to plot the mean and standard deviation of the Faraday Sky 2020 data on an astropy skyplot. The referenced data source (from Radboud University) only includes the mean and standard deviation; the associated longitude and latitude coordinates were obtained from the NASA Goddard LAMBDA-Tools site. The code below shows how to merge the data from both files into a single HDF5 file. For convenience, the Faraday Sky 2020 link is repeated here: https://wwwmpa.mpa-garching.mpg.de/~ensslin/research/data/faraday2020.html

    The resulting file is named: "faraday2020_with_coords.h5".

    from astropy.io import fits
    import h5py
    
    fits_file = 'pixel_coords_map_ring_galactic_res9.fits'
    faraday_file = 'faraday2020.hdf5'
    
    # Note: nothing (not even a space) may follow a line-continuation backslash
    with fits.open(fits_file) as hdul, \
         h5py.File(faraday_file, 'r') as h5r, \
         h5py.File('faraday2020_with_coords.h5', 'w') as h5w:
    
        arr = hdul[1].data
    
        dt = [('LONGITUDE', float), ('LATITUDE', float),
              ('faraday_sky_mean', float), ('faraday_sky_std', float)]

        ds = h5w.create_dataset('skyplotdata', shape=(arr.shape[0],), dtype=dt)
        ds['LONGITUDE'] = arr['LONGITUDE'][:]
        ds['LATITUDE']  = arr['LATITUDE'][:]
        ds['faraday_sky_mean'] = h5r['faraday_sky_mean'][:]
        ds['faraday_sky_std']  = h5r['faraday_sky_std'][:]