deep-learning, pytorch, vgg-net, spacy-transformers

Identical random crop on two images with PyTorch transforms


I am trying to feed two images into a network, and I want to apply the same random transform to both of them. transforms.Compose() takes one image at a time and transforms each image independently, but I want identical transformations. I wrote my own code for hflip(), and now I would like to do the same for random crop. Is there any way to do that without writing custom functions?
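
For reference, a paired flip can be written roughly like this (a sketch only; paired_hflip is just an illustrative name, not my actual code), drawing the random decision once and applying torchvision.transforms.functional.hflip to both images:

    import random
    import torchvision.transforms.functional as TF

    def paired_hflip(img_a, img_b, p=0.5):
        # flip both images, or neither, based on a single random draw
        if random.random() < p:
            return TF.hflip(img_a), TF.hflip(img_b)
        return img_a, img_b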


Solution

  • I would use a workaround like this: make my own crop class that inherits from RandomCrop, redefining __call__ (or forward in newer torchvision versions) with

        …
            if self.call_is_even:
                # sample new crop parameters only on even-numbered calls
                self.ijhw = self.get_params(img, self.size)
            i, j, h, w = self.ijhw
            self.call_is_even = not self.call_is_even
    

    instead of

        i, j, h, w = self.get_params(img, self.size)
    

    The idea is to suppress the randomizer on odd calls, so that the second image in each pair reuses the window sampled for the first. A fuller sketch of such a class is given below.
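
    Put together, a minimal sketch of such a class might look like the following (the class name PairedRandomCrop is just illustrative, and this assumes a recent torchvision where RandomCrop is an nn.Module whose logic lives in forward; the parent's pad_if_needed handling is omitted for brevity):

        import torchvision.transforms as T
        import torchvision.transforms.functional as TF


        class PairedRandomCrop(T.RandomCrop):
            """RandomCrop that re-samples its window only on even-numbered calls,
            so calling it on two images back to back crops them identically."""

            def __init__(self, size, **kwargs):
                super().__init__(size, **kwargs)
                self.call_is_even = True  # the next call is call 0, 2, 4, ...
                self.ijhw = None          # cached crop window (i, j, h, w)

            def forward(self, img):
                # keep the optional fixed padding from the parent class
                if self.padding is not None:
                    img = TF.pad(img, self.padding, self.fill, self.padding_mode)

                # sample a new window on even calls, reuse it on odd calls
                if self.call_is_even:
                    self.ijhw = self.get_params(img, self.size)
                i, j, h, w = self.ijhw
                self.call_is_even = not self.call_is_even

                return TF.crop(img, i, j, h, w)

    Usage would be crop = PairedRandomCrop(224), then out_a = crop(img_a) and out_b = crop(img_b); both crops come from the same (i, j, h, w) window. The calls must stay strictly paired, otherwise the cached window goes out of sync between the two images.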