Unstable results in test mode with fractional max pooling in PyTorch

I build variants of ResNet based on the TorchVision implementations, modify them, train them, and so on. What I have found is that even in .eval() mode, and even if I reload the saved state right before evaluation, I get different results on repeated forward passes over the same input. The code looks like this:

...
imageData = imageDataset.getImage(imageNum)
imageData = np.expand_dims(imageData, 0)           # add batch dimension
pytImageData = torch.from_numpy(imageData).cuda()
...
self.loadState(self.curEpochNum)                   # restore saved weights
model.eval()
self.pytOptimizer.zero_grad()
with torch.set_grad_enabled(False):
    activations = model.forward(pytImageData).cpu().numpy()

    # reload exactly the same weights and run the same input again
    self.loadState(self.curEpochNum)
    activations2 = model.forward(pytImageData).cpu().numpy()
    diff = activations2 - activations
    print('diff', diff.min(), diff.max())

The difference is quite large. I discovered this while investigating occlusion heatmaps: they were quite noisy (and, I would even say, gave strange results, but maybe that is just one more mystery of neural nets with a different cause).

ImageNet-1000 dataset,
torch version 1.5.0.dev20200128,
torchvision version 0.6.0.dev20200128

Given that this is caused by fractional max pooling, another question appears: how can it be made stable? As far as I understand, only the random mode is implemented in PyTorch, so new random pooling regions are drawn on every forward pass. But the author of the method also proposed a pseudorandom mode. Is it possible, for example, to turn that on somehow?
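
For reference, one thing that might at least pin the regions, though it is not the pseudorandom mode from the paper: nn.FractionalMaxPool2d accepts a _random_samples tensor (underscore-prefixed, so semi-private and possibly subject to change), and fixing it makes repeated forward passes use the same pooling regions. A minimal sketch with assumed shapes, not my actual model:

import torch
import torch.nn as nn

torch.manual_seed(0)
N, C = 1, 64                      # assumed batch size and channel count of the layer input
samples = torch.rand(N, C, 2)     # one (u, v) pair in [0, 1) per plane, drawn once and reused

pool = nn.FractionalMaxPool2d(kernel_size=3, output_size=38,
                              _random_samples=samples)

x = torch.randn(N, C, 80, 80)     # assumed input resolution
out1 = pool(x)
out2 = pool(x)
print(torch.equal(out1, out2))    # True: the regions no longer change between calls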

Topic pytorch convolutional-neural-network

Category Data Science


The answer (or at least one of them) is the FractionalMaxPool2d layer I use. If I take, say, ResNet-34 and train it a bit (say, 1/6 of an epoch), everything is fine. If I add a single FractionalMaxPool2d layer:

...
self.layer1 = self._make_layer(block, 2, 64, layers[0])
# extra fractional pooling layer inserted between layer1 and layer2
self.maxpool2 = nn.FractionalMaxPool2d(kernel_size=3, output_size=38)
self.layer2 = self._make_layer(block, 3, 128, layers[1], stride=2,
...

(and call it in the forward method, of course)

and repeat the experiment, the difference appears.
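
To see that the pooling layer alone is enough to cause this, here is a minimal standalone sketch (with assumed shapes, not the actual ResNet). FractionalMaxPool2d draws new random pooling regions on every forward pass, even under eval() and with gradients disabled, so two passes over the same input differ; reseeding the RNG right before each pass makes them match again.

import torch
import torch.nn as nn

pool = nn.FractionalMaxPool2d(kernel_size=3, output_size=38).eval()
x = torch.randn(1, 64, 80, 80)              # assumed shape

with torch.no_grad():
    a = pool(x)
    b = pool(x)
print('diff', (a - b).abs().max().item())   # typically non-zero: new regions each call

with torch.no_grad():
    torch.manual_seed(0)                    # fix the RNG state before each pass
    a = pool(x)
    torch.manual_seed(0)
    b = pool(x)
print('diff', (a - b).abs().max().item())   # 0.0: the same regions are drawn

Reseeding is handy as a quick check, but it resets the global RNG state; pinning a _random_samples tensor, as sketched above, seems cleaner for actual evaluation.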
