image-processing keras conv-neural-network pytorch gabor-filter

Fixed Gabor Filter Convolutional Neural Networks


I'm trying to build a CNN with some conv layers where half of the filters in the layer are fixed and the other half is learnable while training the model. But I couldn't find anything about how to do that.

What I'm trying to do is similar to what they did in this paper: https://arxiv.org/pdf/1705.04748.pdf

Is there a way to do that in Keras, PyTorch...?


Solution

  • Sure. In PyTorch you can use nn.Conv2d and

    1. set its weight parameter manually to your desired filters
    2. exclude these weights from learning

    A simple example would be:

    import torch
    import torch.nn as nn
    
    class Model(nn.Module):
        def __init__(self):
            super(Model, self).__init__()
    
            self.conv_learning = nn.Conv2d(1, 5, 3, bias=False)
            # the fixed branch: declared to match the weight assigned below,
            # since nn.Conv2d stores weights as (out_channels, in_channels, kH, kW)
            self.conv_gabor = nn.Conv2d(5, 1, 3, bias=False)
            # weights HAVE TO be wrapped in `nn.Parameter` even if they are not learning
            self.conv_gabor.weight = nn.Parameter(torch.randn(1, 5, 3, 3))
    
        def forward(self, x):
            y = self.conv_learning(x)
            y = torch.sigmoid(y)
            y = self.conv_gabor(y)
    
            # reduce each sample to a single scalar prediction, shape (batch,)
            return y.mean(dim=(1, 2, 3))
    
    model = Model()
    xs = torch.randn(10, 1, 30, 30)
    ys = torch.randn(10)
    loss_fn = nn.MSELoss()
    
    # we can exclude parameters from being learned here by filtering them
    # out based on some criterion. For instance, if all your fixed filters
    # have "gabor" in their name, the following will do
    learning_parameters = (param for name, param in model.named_parameters()
                                 if 'gabor' not in name)
    optim = torch.optim.SGD(learning_parameters, lr=0.1)
    
    epochs = 10
    for e in range(epochs):
        y = model(xs)
        loss = loss_fn(y, ys)
    
        model.zero_grad()
        loss.backward()
        optim.step()
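
    The example above fixes random filters. To fix actual Gabor filters, as in the paper, you can generate a small Gabor bank and assign it as the weight instead of torch.randn. Below is a minimal sketch; gabor_kernel is a helper of my own using the standard Gabor definition (parameter names follow OpenCV's getGaborKernel), and the bank is reshaped to fit the toy conv_gabor layer above:

    import math
    import torch
    import torch.nn as nn

    def gabor_kernel(size, sigma, theta, lambd, psi=0.0, gamma=0.5):
        # sample one real-valued Gabor filter on a size x size grid
        half = size // 2
        coords = torch.arange(-half, half + 1, dtype=torch.float32)
        ys, xs = torch.meshgrid(coords, coords, indexing="ij")
        x_t = xs * math.cos(theta) + ys * math.sin(theta)
        y_t = -xs * math.sin(theta) + ys * math.cos(theta)
        envelope = torch.exp(-(x_t ** 2 + (gamma * y_t) ** 2) / (2 * sigma ** 2))
        carrier = torch.cos(2 * math.pi * x_t / lambd + psi)
        return envelope * carrier

    # one 3x3 filter per orientation
    thetas = [i * math.pi / 5 for i in range(5)]
    bank = torch.stack([gabor_kernel(3, sigma=1.0, theta=t, lambd=2.0)
                        for t in thetas])

    # reshape to (out_channels, in_channels, kH, kW); (1, 5, 3, 3) fits the
    # toy conv_gabor above. In a real model you would usually give each
    # Gabor filter its own output channel instead, e.g. bank.unsqueeze(1)
    # for a Conv2d(1, 5, 3) layer.
    model.conv_gabor.weight = nn.Parameter(bank.unsqueeze(0))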
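
    As for Keras, the same two ingredients exist there: mark the fixed Conv2D layer trainable=False and overwrite its kernel with set_weights. A minimal sketch, assuming a single-channel 30x30 input, channels-last layout, and random stand-in weights (Keras stores kernels as (kH, kW, in_channels, out_channels)):

    import numpy as np
    import tensorflow as tf

    inp = tf.keras.Input(shape=(30, 30, 1))
    learned = tf.keras.layers.Conv2D(5, 3, use_bias=False)(inp)
    fixed_conv = tf.keras.layers.Conv2D(5, 3, use_bias=False, trainable=False)
    fixed = fixed_conv(inp)
    # half learnable, half fixed, side by side in the same layer
    out = tf.keras.layers.Concatenate()([learned, fixed])
    model = tf.keras.Model(inp, out)

    # replace the random stand-in with your Gabor bank here
    fixed_conv.set_weights([np.random.randn(3, 3, 1, 5).astype("float32")])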