Tags: python, deep-learning, transfer-learning, pre-trained-model, relu

Setting ReLU inplace to 'False'


Below I have written a function that accepts a pretrained model as an argument (vgg, resnet, densenet, etc.) and sets the inplace attribute of every nn.ReLU in it to False, returning the model. It was written after testing many different specific architectures.

I would like to rewrite it more compactly, since the nested loops and try/except blocks do not seem optimal to me. However, I am not a developer and do not have much coding experience. Could you please assist with this?

import torch.nn as nn

def ReLU_inplace_to_False(model):
    # each nested loop handles one more level of submodules;
    # the bare except clauses swallow layers that cannot be iterated
    for module in model._modules.values():
        if isinstance(module, nn.ReLU):
            module.inplace = False
        try:
            for layer in module:
                if isinstance(layer, nn.ReLU):
                    layer.inplace = False
                try:
                    for sublayer in layer._modules.values():
                        if isinstance(sublayer, nn.ReLU):
                            sublayer.inplace = False
                        try:
                            for subsublayer in sublayer._modules.values():
                                if isinstance(subsublayer, nn.ReLU):
                                    subsublayer.inplace = False
                                try:
                                    for subsubsublayer in subsublayer._modules.values():
                                        if isinstance(subsubsublayer, nn.ReLU):
                                            subsubsublayer.inplace = False
                                except:
                                    pass
                        except:
                            pass
                except:
                    pass
        except:
            pass
    
    return model
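
A minimal call site, assuming torchvision as the source of the pretrained models (which is where vgg, resnet, and densenet live), would look like this:

import torchvision.models as models

model = models.vgg16()  # the weights themselves do not matter for flipping the flags
model = ReLU_inplace_to_False(model)

# sanity check: no ReLU anywhere in the model should still be inplace
assert all(not m.inplace for m in model.modules() if isinstance(m, nn.ReLU))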

Solution

  • This calls for a recursive solution: the module tree can be arbitrarily deep, so a function that visits each child and then calls itself on that child covers every level of nesting.

    def ReLU_inplace_to_False(module):
        for layer in module._modules.values():
            if isinstance(layer, nn.ReLU):
                layer.inplace = False
            # recurse into the child; a ReLU has no children, so this is a no-op for it
            ReLU_inplace_to_False(layer)
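
  • As a side note, PyTorch's nn.Module already provides a recursive iterator, modules(), which yields the module itself and every descendant. A sketch using it (same effect, but without manual recursion or the private _modules dict):

    def ReLU_inplace_to_False(model):
        # modules() walks the entire module tree, so one flat loop suffices
        for layer in model.modules():
            if isinstance(layer, nn.ReLU):
                layer.inplace = False
        return model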