keras, neural-network, keras-layer

How to add random value to each weight of a Keras model


I have a nontrivial model with several nontrivial layers for which I'd like to do the following: I create a random vector of values v with length equal to the number of parameters of the model. Now I'd like to add v to the model's parameters in the intuitive way, i.e., each component of v is added to a single parameter of the model. This sum is then set as the new weights. So basically all I need is all the model parameters as an N x 1 vector, add v to that vector, and put the result back into the model.

Currently I do something like (untested):

    import numpy as np

    N = model.count_params()
    b = GenerateVectorV(N)  # random vector of length N
    weights_per_layer = model.get_weights()

    # Walk through b, slicing off one chunk per weight array
    idx = 0
    for i in range(len(weights_per_layer)):
        sec = b[idx:idx + weights_per_layer[i].size]
        idx += weights_per_layer[i].size
        weights_per_layer[i] = weights_per_layer[i] + np.reshape(sec, weights_per_layer[i].shape)

and then I could update the model with the changed weights via model.set_weights. This feels too much like C code; I am sure there is a simpler way.


Solution

  • Let's imagine that you just want to add Gaussian noise (replace np.random.normal with any other noise generator); then the logic is really a one-liner:

    new_weights = [weight + np.random.normal(size=weight.shape)
                   for weight in model.get_weights()]
    

    or, even simpler, since np.random.normal(loc=weight) samples noise centered on each weight:

    new_weights = [np.random.normal(loc=weight) for weight in model.get_weights()]
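    If you do want the original formulation (one pre-generated vector v distributed across all parameters, rather than fresh noise per array), the same slicing logic collapses into np.split plus a list comprehension. A NumPy-only sketch, where the list of zero arrays stands in for the output of model.get_weights() and the shapes are purely illustrative:

    ```python
    import numpy as np

    # Stand-ins for model.get_weights(): a list of arrays of various shapes.
    weights = [np.zeros((3, 2)), np.zeros(2), np.zeros((2, 4))]

    # v is any vector with one entry per parameter (here 6 + 2 + 8 = 16);
    # with a real model you would use length model.count_params().
    v = np.arange(sum(w.size for w in weights), dtype=float)

    # Split v at the cumulative sizes and reshape each chunk to match.
    splits = np.cumsum([w.size for w in weights])[:-1]
    new_weights = [w + chunk.reshape(w.shape)
                   for w, chunk in zip(weights, np.split(v, splits))]
    ```

    With a real Keras model you would then write the result back with model.set_weights(new_weights).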