machine-learning deep-learning pytorch resnet fine-tuning

Implementing Dropout in a pretrained ResNet model in PyTorch


I am trying to add Dropout layers to a pretrained ResNet model in PyTorch. Here is my code:

    feats_list = []
    for key, value in model._modules.items():
        feats_list.append(value)

    for feat in feats_list:
        if isinstance(feat, nn.Conv2d) or isinstance(feat, nn.Conv1d):
            feats_list.append(nn.Dropout(p=0.5, inplace=True))

    model.features = nn.Sequential(*feats_list)
    print(model.features)

I think it should apply dropout after all the Conv2d and Conv1d layers, but in reality only a single Dropout layer gets attached, at the very end of the model.

This is what I got:

...
  (8): AdaptiveAvgPool2d(output_size=(1, 1))
  (9): Linear(in_features=2048, out_features=1000, bias=True)
  (10): Dropout(p=0.5, inplace=True)

Could someone help me? Thank you.

Here is the partial code block, FYI:

    def generic_classifier(model, criterion, optimizer, num_epochs=25):
        # try to select specific layers to freeze or unfreeze in the pretrained model
        # True: trainable; False: frozen (untrainable)
        '''
        n = 0
        for param in model.parameters():
            if n < 7:
                param.requires_grad = True
            else:
                param.requires_grad = False
            n += 1
        '''
        feats_list = []
        for key, value in model._modules.items():
            feats_list.append(value)

        for feat in feats_list:
            if isinstance(feat, nn.Conv2d) or isinstance(feat, nn.Conv1d):
                feats_list.append(nn.Dropout(p=0.5, inplace=True))
                #print(feat)
        #print(feats_list)

        # modify convolution layers
        model.features = nn.Sequential(*feats_list)
        print(model.features)
        #for name, param in model.named_parameters():
        #    print(name, param.requires_grad)

        # remove all the fully connected layers
        model.fc = nn.Sequential()

        # add a number of fully connected layers of our choice right after the convolutional layers
        model.fc = nn.Sequential(
            # need to know the output size of the last conv stage of the selected architecture
            # resnet50: 2048, resnet18: 512, resnet34: 512
            # did not find a way to automate this part yet.
            nn.Linear(512, 256),
            nn.ReLU(),
            nn.Linear(256, 256),
            nn.ReLU(),
            nn.Linear(256, 3)
        )

        model = model.to(device)

        # train the model
        model_with_pretrained_train_acc = []
        model_with_pretrained_test_acc = []
        start = time.time()

    ...
    ...
            return model

Solution

  • The issue in your code is that you append the nn.Dropout layers to feats_list inside the very loop that iterates over that list. list.append always adds to the end, so each Dropout lands after the last layer of the architecture instead of right after the convolution that triggered it, and the loop simply continues over the newly appended entries. That is why a single Dropout shows up tacked onto the very end of your model.
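
    To see the behaviour concretely, here is a minimal sketch in plain Python (the layer names are made up purely for illustration):

        layers = ["conv", "bn", "relu"]
        for layer in layers:
            if layer == "conv":
                layers.append("dropout")  # append always adds at the END of the list
        print(layers)  # ['conv', 'bn', 'relu', 'dropout']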

    The following code loops over the model's top-level children and, whenever it encounters a convolutional layer, creates an identical copy, appends it to the list, and follows it immediately with a dropout layer; every other layer is appended as is. Three details are fixed relative to the snippet above: the bias argument of nn.Conv2d expects a bool rather than the bias Parameter itself, the copied layer must receive the pretrained weights or it would start from a random initialization, and because nn.Sequential drops the torch.flatten call that ResNet's forward() performs between avgpool and fc, a Flatten layer is inserted before the classifier. (Note also that named_children does not recurse into the residual blocks, so on a ResNet only the stem convolution conv1 is matched; reaching the convolutions inside layer1 through layer4 would require a recursive traversal.)

        import torch.nn as nn
        from torchvision import models

        model = models.resnet18(pretrained=True)

        feats_list = []
        for key, value in model.named_children():
            if isinstance(value, nn.Conv2d) or isinstance(value, nn.Conv1d):
                # re-create the conv layer so a Dropout can sit right after it
                new_conv = nn.Conv2d(
                    in_channels=value.in_channels,
                    out_channels=value.out_channels,
                    kernel_size=value.kernel_size,
                    stride=value.stride,
                    padding=value.padding,
                    bias=value.bias is not None,  # bias expects a bool, not the Parameter
                )
                new_conv.load_state_dict(value.state_dict())  # keep the pretrained weights
                feats_list.append(new_conv)
                feats_list.append(nn.Dropout(p=0.5, inplace=True))
            else:
                if isinstance(value, nn.Linear):
                    # ResNet flattens between avgpool and fc in its forward();
                    # nn.Sequential loses that step, so restore it explicitly
                    feats_list.append(nn.Flatten(1))
                feats_list.append(value)

        # Create a new model with the modified layers
        model = nn.Sequential(*feats_list)
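
    As a quick sanity check (a minimal sketch; the 224x224 input assumes the standard ImageNet resolution), a dummy forward pass should produce the usual 1000-class output:

        import torch

        x = torch.randn(1, 3, 224, 224)  # a dummy batch of one ImageNet-sized image
        print(model(x).shape)            # expected: torch.Size([1, 1000])

    Incidentally, the classifier's input size does not have to be hard-coded per architecture: for torchvision ResNets it can be read off with model.fc.in_features before replacing the fc layer.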