When using the drop-in FFTW replacement for scipy.fftpack, is the wisdom computed once per FFT size and then never again?
I have a loop which repeatedly calls
freqImg = np.fft.fftshift(pyfftw.interfaces.scipy_fftpack.fft2(slc, threads=1, planner_effort='FFTW_PATIENT'))
For reasons I cannot go into, I cannot do the FFTs in batch. Is the wisdom recomputed on every iteration of the loop, or does the planner cache it and reuse it on each iteration?
Yes, previously computed wisdom is reused on subsequent iterations of the loop, but it is forgotten when the process ends unless you explicitly export it and reload it later (see `pyfftw.export_wisdom` and `pyfftw.import_wisdom`). The interfaces API also provides a cache (`pyfftw.interfaces.cache`) that stores and reuses the constructed FFTW object itself, should that be possible and desired, which avoids even the per-call setup overhead. I suggest looking at the docs for that.