I want a pure numpy solution if possible.
I have a grayscale image. I have sampled 75 (15x5) cells in the image, calculating the average pixel value for each cell, which results in a 15x5 matrix. Now I want to construct a z-score matrix that uses the row-wise average and row-wise standard deviation, so each element in a row of 5 columns is the z-score for that element computed over that row.
I have tried the following code:
pixelzscore=(pixelscore-np.average(pixelscore,axis=1))/np.std(pixelscore,axis=1)
As expected, this does not work because the operand shapes do not match: the row means and standard deviations have shape (15,), while pixelscore has shape (15, 5).
Is there a pure numpy pythonic way to calculate this quickly without resorting to the scipy function? I just do not want to load additional packages that I don't actually need.
Since the same mean and standard deviation apply to every cell of a row, you can calculate them once per row and then use np.tile
to repeat them across all columns, e.g.:
pixelzscore = (pixelscore - np.tile(np.mean(pixelscore, axis=1), (5, 1)).T) / np.tile(np.std(pixelscore, axis=1), (5, 1)).T
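For reference, here is a small self-contained sketch of the tile-based approach on made-up data (the random pixelscore array and the row_mean/row_std names are only for illustration; pixelscore and pixelzscore are taken from your question). Each row of the result should come out with mean ~0 and std ~1:

import numpy as np

# Hypothetical 15x5 matrix of per-cell average pixel values (stand-in data for illustration).
rng = np.random.default_rng(0)
pixelscore = rng.uniform(0, 255, size=(15, 5))

# Row-wise mean and std, tiled out to the full 15x5 shape so the shapes line up.
row_mean = np.tile(np.mean(pixelscore, axis=1), (5, 1)).T
row_std = np.tile(np.std(pixelscore, axis=1), (5, 1)).T
pixelzscore = (pixelscore - row_mean) / row_std

# Sanity check: each row should now have mean ~0 and std ~1.
print(pixelzscore.mean(axis=1))
print(pixelzscore.std(axis=1))

If you do not mind relying on broadcasting instead of np.tile, passing keepdims=True keeps the row statistics as 15x1 columns, so (pixelscore - pixelscore.mean(axis=1, keepdims=True)) / pixelscore.std(axis=1, keepdims=True) gives the same result without tiling.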