python, tensorflow, machine-learning, deep-learning, pruning

Why do parameters increase after pruning with TensorFlow's tfmot?


I was pruning a model using the TensorFlow Model Optimization library. Initially, my model had 20,410 parameters in total, as shown in the first model summary.

I trained this model on a default dataset and it gave me an accuracy of 96 percent, which is good. Then I saved the model architecture to a JSON file and its weights to an h5 file. I loaded this model into another script to prune it; after applying pruning and compiling the model, I got the second model summary.

Although the model is pruned well and there is a significant reduction in parameters, the problem is: why did the total number of parameters increase after applying pruning? Also, even after subtracting the non-trainable parameters, the pruned model and the plain model have the same number of parameters. Can anyone explain whether this is normal or I am doing something wrong, and why this happens? Thank you in advance :)
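For reference, here is a minimal sketch of the workflow described above (the layer sizes, file names, and compile settings are assumptions, not my actual code):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Original model (hypothetical layer sizes; the real model has
# 20,410 parameters in total)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(25, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.summary()  # first model summary: ~20k total parameters

# Save the architecture as JSON and the weights as h5
with open('model.json', 'w') as f:
    f.write(model.to_json())
model.save_weights('weights.h5')

# In another script: reload the model, wrap it for pruning, compile
with open('model.json') as f:
    loaded = tf.keras.models.model_from_json(f.read())
loaded.load_weights('weights.h5')

pruned = tfmot.sparsity.keras.prune_low_magnitude(loaded)
pruned.compile(optimizer='adam',
               loss='sparse_categorical_crossentropy',
               metrics=['accuracy'])
pruned.summary()  # second model summary: total parameter count has grown
```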


Solution

  • It is normal. Pruning doesn't change the original model's structure, so it is not meant to reduce the number of parameters.

    Pruning is a model optimization technique that zeroes out low-magnitude (in other words, unnecessary) values in the weights.

    The second model summary shows the parameters added for pruning: they are the non-trainable parameters, and they stand for masking. In a nutshell, TensorFlow adds a non-trainable mask to each prunable weight tensor in the network to specify which of the weights should be pruned; the masks consist of 0s and 1s (see the sketch below). The trainable parameter count stays the same because pruning only zeroes weights out; it does not remove them from the tensors, which is why the pruned and plain models report the same number of trainable parameters.
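A minimal sketch of where those masks live and how to drop them again, assuming a hypothetical small model and tfmot's prune_low_magnitude / strip_pruning API:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Hypothetical small model standing in for the asker's 20,410-parameter one
base = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(25, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
pruned = tfmot.sparsity.keras.prune_low_magnitude(base)

# The extra non-trainable parameters are the pruning wrappers' variables:
# one 0/1 'mask' per prunable weight tensor, plus small bookkeeping
# variables such as 'threshold'.
for w in pruned.non_trainable_weights:
    print(w.name, tuple(w.shape))

# Once pruning has finished, strip_pruning removes the wrappers again,
# restoring the original parameter count; pruned weights are simply zero.
final = tfmot.sparsity.keras.strip_pruning(pruned)
final.summary()  # same totals as the original, unpruned summary
```

Each mask has the same shape as the weight tensor it covers, so the non-trainable count is close to the number of weights being pruned, which is why the total in the second summary looks roughly doubled.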