python, scipy, signal-processing, fft, spectrogram

How to achieve consistent scaling of spectrograms with new and old scipy APIs


When making spectrograms with scipy.signal.spectrogram and scipy.signal.ShortTimeFFT I expect to get numerically similar outputs, but the magnitudes of the outputs are very different. I don't understand which parameters lead to equivalent results.

Following the examples in the docs, I compute a magnitude spectrogram in three ways. Instead of being similar, the results have completely different magnitudes:

import scipy.signal
import numpy as np
np.random.seed(0)
x = np.random.random(1000)

noverlap = 50
fs = 1000
nperseg = 100
N = len(x)

f1, t1, s1 = scipy.signal.spectrogram(x, fs=fs, scaling='spectrum', nperseg=nperseg, noverlap=noverlap, mode='magnitude')

stfft = scipy.signal.ShortTimeFFT.from_window(('tukey', 0.25), fs=fs, nperseg=nperseg, noverlap=noverlap,
                               fft_mode='onesided', scale_to='magnitude', phase_shift=None)
s2 = stfft.spectrogram(x)

SFT = scipy.signal.ShortTimeFFT.from_window(('tukey',.25), fs=fs, nperseg=nperseg, noverlap=noverlap,
                               fft_mode='onesided',
                               scale_to='magnitude', phase_shift=None)
Sz3 = SFT.stft(x, p0=0, p1=(N-noverlap)//SFT.hop, k_offset=nperseg//2)
t3 = SFT.t(N, p0=0, p1=(N-noverlap)//SFT.hop, k_offset=nperseg//2)
s3 = np.sqrt(Sz3.real**2 + Sz3.imag**2)

s1.mean(), s2.mean(), s3.mean()
# output: (0.026607584897909715, 0.0056786357567832615, 0.038652586739534416)

I realize that the exact values for s2 will be different because the two APIs handle the time bins at the edges differently, but the mean magnitudes should be roughly similar, and I think s1 and s2 should contain the same values (see the quick check below).
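
For reference, here is a quick sanity check (not a fix) that just prints the time bins and slice range of each API, using the objects defined above, to see the differing edge handling:

print(t1)                           # time bins from the legacy spectrogram
print(stfft.t(N))                   # time bins from ShortTimeFFT (default slice range)
print(stfft.p_min, stfft.p_max(N))  # slice index range used by ShortTimeFFT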

Why are these different? Which parameters will lead to similar results for scipy.signal.spectrogram and scipy.signal.ShortTimeFFT when using one-sided FFTs and magnitude (as opposed to PSD) scaling?


Solution

  • I've got something very similar, but not a 100% one-to-one match.

    import matplotlib.pyplot as plt  # x, fs, nperseg, noverlap, np and scipy come from the question
    
    f1, t1, s1 = scipy.signal.spectrogram(x, fs=fs, scaling='spectrum', nperseg=nperseg, noverlap=noverlap, mode='magnitude')
    
    stfft = scipy.signal.ShortTimeFFT.from_window(('tukey', 0.25), fs=fs, nperseg=nperseg, noverlap=noverlap, fft_mode='onesided', scale_to='magnitude')
    s2 = np.sqrt(stfft.spectrogram(x - np.mean(x)))  # spectrogram() returns |S|**2, so take the square root; remove the global mean first
    
    fig, axes = plt.subplots(1, 3)
    axes[0].imshow(s1, cmap='nipy_spectral', aspect='auto')
    axes[1].imshow(s2[:, 1:-1], cmap='nipy_spectral', aspect='auto')
    axes[2].imshow(s1 - s2[:, 1:-1], cmap='nipy_spectral', aspect='auto', vmax=np.max(s1), vmin=-np.max(s1))
    plt.show()
    
    

    Note the discrepancies at the lower frequencies. Removing the global mean helps, but it is not exactly what the legacy scipy.signal.spectrogram function does.

    There is a detrend parameter (default "constant") that is probably the reason: presumably it removes the mean in each frame, not globally.
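
    As a rough sketch of that idea (assuming 'constant' detrending simply subtracts each segment's own mean, which is what scipy.signal.detrend with type='constant' does):

    frame = x[:nperseg]                                      # first analysis frame
    detrended = scipy.signal.detrend(frame, type='constant')
    print(np.allclose(detrended, frame - frame.mean()))      # True: per-frame mean removal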

    [Plots: s1, trimmed s2, and their difference, showing residual discrepancies at low frequencies]

    EDIT: YES, DETREND WAS THE PROBLEM

    Passing this parameter, instead of removing the mean manually, completely solves the problem.

    s2 = np.sqrt(stfft.spectrogram(x, detr='constant'))  # per-segment detrending, matching the legacy default
    
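    A quick numerical check, reusing the arrays computed above (the [:, 1:-1] trim assumes the edge slices align the same way as in the plots):

    print(s1.mean(), s2[:, 1:-1].mean())     # the means should now agree
    print(np.abs(s1 - s2[:, 1:-1]).max())    # elementwise difference, expected to be ~0
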

    [Difference plot with detr='constant': the discrepancy is gone]