I'm trying to run a ResNet-like image classification model on a CPU and want a breakdown of the time each layer of the model takes.
The issue I'm facing is that the GitHub repo https://github.com/facebookresearch/semi-supervised-ImageNet1K-models provides the model only as a .pth file. It is very large (hundreds of MB), and I don't know exactly how it differs from an ordinary PyTorch model definition, other than that it's binary.
I load the model from this file using the following script, but I don't see a way to modify the model or insert t = time.time() statements between model layers to break down the time spent in each layer.
Questions:
Would running the model with the following script give a correct estimate of the end-to-end time (t2 - t1) it takes to run the model on the CPU, or would it also include PyTorch compilation/setup time?
How do I insert timing statements between consecutive layers to get a per-layer breakdown?
There is no inference or training script at the GitHub link; it only provides the .pth file. So how exactly is one supposed to run inference or training? And how can I insert additional layers between consecutive layers of the .pth model and save the result?
#!/usr/bin/env python
import time

import torch
import torchvision  # the hub model is built from torchvision's ResNet code

model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext50_32x4d_swsl', force_reload=False)
x = torch.randn(1, 3, 224, 224)  # dummy input; 'in' is a reserved word in Python
t1 = time.time()
out = model(x)
t2 = time.time()
A simple way to implement this is to register a forward hook on each module of the model. The hook updates a global timestamp and records the time elapsed between the previous hook call and the current one.
For example:
import torch
import torchvision
import time

global_time = None   # timestamp of the previously fired hook
exec_times = []      # elapsed time recorded by each hook, in firing order

def store_time(self, input, output):
    # Forward hook: record the time since the last hook fired and reset the clock
    global global_time, exec_times
    exec_times.append(time.time() - global_time)
    global_time = time.time()

model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models', 'resnext50_32x4d_swsl', force_reload=False)
x = torch.randn(1, 3, 224, 224)

# Register the hook on every module to compute the time differences
for module in model.modules():
    module.register_forward_hook(store_time)

global_time = time.time()
out = model(x)
t2 = time.time()

for module, t in zip(model.modules(), exec_times):
    print(f"{module.__class__}: {t}")
The output I get is:
<class 'torchvision.models.resnet.ResNet'>: 0.004999876022338867
<class 'torch.nn.modules.conv.Conv2d'>: 0.002006053924560547
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009946823120117188
<class 'torch.nn.modules.activation.ReLU'>: 0.007998466491699219
<class 'torch.nn.modules.pooling.MaxPool2d'>: 0.0010004043579101562
<class 'torch.nn.modules.container.Sequential'>: 0.0020003318786621094
<class 'torchvision.models.resnet.Bottleneck'>: 0.0010023117065429688
<class 'torch.nn.modules.conv.Conv2d'>: 0.017997026443481445
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010018348693847656
<class 'torch.nn.modules.conv.Conv2d'>: 0.0009999275207519531
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.003000497817993164
<class 'torch.nn.modules.conv.Conv2d'>: 0.003999948501586914
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.001997232437133789
<class 'torch.nn.modules.activation.ReLU'>: 0.004001140594482422
<class 'torch.nn.modules.container.Sequential'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.001999378204345703
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torchvision.models.resnet.Bottleneck'>: 0.003001689910888672
<class 'torch.nn.modules.conv.Conv2d'>: 0.0020008087158203125
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009992122650146484
<class 'torch.nn.modules.conv.Conv2d'>: 0.0019991397857666016
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010001659393310547
<class 'torch.nn.modules.conv.Conv2d'>: 0.0009999275207519531
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.002998828887939453
<class 'torch.nn.modules.activation.ReLU'>: 0.0010013580322265625
<class 'torchvision.models.resnet.Bottleneck'>: 0.0029997825622558594
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.002999544143676758
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010006427764892578
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.001001119613647461
<class 'torch.nn.modules.conv.Conv2d'>: 0.0019979476928710938
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010018348693847656
<class 'torch.nn.modules.activation.ReLU'>: 0.0010001659393310547
<class 'torch.nn.modules.container.Sequential'>: 0.00299835205078125
<class 'torchvision.models.resnet.Bottleneck'>: 0.002004384994506836
<class 'torch.nn.modules.conv.Conv2d'>: 0.0009975433349609375
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.005999088287353516
<class 'torch.nn.modules.conv.Conv2d'>: 0.0020003318786621094
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010001659393310547
<class 'torch.nn.modules.activation.ReLU'>: 0.0020017623901367188
<class 'torch.nn.modules.container.Sequential'>: 0.0009970664978027344
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0029997825622558594
<class 'torchvision.models.resnet.Bottleneck'>: 0.0010008811950683594
<class 'torch.nn.modules.conv.Conv2d'>: 0.00500035285949707
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009984970092773438
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0020020008087158203
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0019979476928710938
<class 'torch.nn.modules.activation.ReLU'>: 0.0010018348693847656
<class 'torchvision.models.resnet.Bottleneck'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.00099945068359375
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.001001119613647461
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.002997875213623047
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010013580322265625
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.002000570297241211
<class 'torch.nn.modules.activation.ReLU'>: 0.0
<class 'torchvision.models.resnet.Bottleneck'>: 0.001997232437133789
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010008811950683594
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.001001596450805664
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.00099945068359375
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.002998828887939453
<class 'torch.nn.modules.activation.ReLU'>: 0.0010020732879638672
<class 'torch.nn.modules.container.Sequential'>: 0.0010020732879638672
<class 'torchvision.models.resnet.Bottleneck'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.001995563507080078
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.002001523971557617
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010001659393310547
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010008811950683594
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.activation.ReLU'>: 0.0029985904693603516
<class 'torch.nn.modules.container.Sequential'>: 0.0009989738464355469
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010068416595458984
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torchvision.models.resnet.Bottleneck'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.004993438720703125
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010013580322265625
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010001659393310547
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010018348693847656
<class 'torch.nn.modules.conv.Conv2d'>: 0.001997709274291992
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.activation.ReLU'>: 0.0019991397857666016
<class 'torchvision.models.resnet.Bottleneck'>: 0.0029990673065185547
<class 'torch.nn.modules.conv.Conv2d'>: 0.0030128955841064453
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0019872188568115234
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0029993057250976562
<class 'torch.nn.modules.activation.ReLU'>: 0.0010008811950683594
<class 'torchvision.models.resnet.Bottleneck'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010006427764892578
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009992122650146484
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.003001689910888672
<class 'torch.nn.modules.conv.Conv2d'>: 0.0019986629486083984
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0010008811950683594
<class 'torch.nn.modules.activation.ReLU'>: 0.0
<class 'torchvision.models.resnet.Bottleneck'>: 0.002000093460083008
<class 'torch.nn.modules.conv.Conv2d'>: 0.0019986629486083984
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0020012855529785156
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0019981861114501953
<class 'torch.nn.modules.activation.ReLU'>: 0.0030014514923095703
<class 'torchvision.models.resnet.Bottleneck'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0029985904693603516
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010008811950683594
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0010013580322265625
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009989738464355469
<class 'torch.nn.modules.activation.ReLU'>: 0.0
<class 'torch.nn.modules.container.Sequential'>: 0.002998828887939453
<class 'torchvision.models.resnet.Bottleneck'>: 0.002000570297241211
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.003000497817993164
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0020020008087158203
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0009982585906982422
<class 'torch.nn.modules.activation.ReLU'>: 0.0009996891021728516
<class 'torch.nn.modules.container.Sequential'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0029990673065185547
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0020003318786621094
<class 'torchvision.models.resnet.Bottleneck'>: 0.0010025501251220703
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0019981861114501953
<class 'torch.nn.modules.conv.Conv2d'>: 0.0019996166229248047
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0019996166229248047
<class 'torch.nn.modules.activation.ReLU'>: 0.0
<class 'torchvision.models.resnet.Bottleneck'>: 0.0030002593994140625
<class 'torch.nn.modules.conv.Conv2d'>: 0.0020012855529785156
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.0
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0
<class 'torch.nn.modules.conv.Conv2d'>: 0.006000518798828125
<class 'torch.nn.modules.batchnorm.BatchNorm2d'>: 0.0019979476928710938
<class 'torch.nn.modules.activation.ReLU'>: 0.0
<class 'torch.nn.modules.pooling.AdaptiveAvgPool2d'>: 0.002003192901611328
<class 'torch.nn.modules.linear.Linear'>: 0.0019965171813964844
Process finished with exit code 0
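Two caveats about interpreting these numbers. First, forward hooks fire in execution order (a container's hook fires only after all of its children have run), while model.modules() yields modules in definition order, so the zip above can pair a time with the wrong label; the container modules (ResNet, Sequential, Bottleneck) also overlap with their children. Second, the very first forward pass typically includes one-time setup cost, so a single cold t2 - t1 tends to overstate steady-state latency. A variant that hooks only leaf modules, binds each module's name into its hook, and does an untimed warm-up pass might look like the sketch below (one possible variation, not from the linked repo):

import time
from functools import partial

import torch

model = torch.hub.load('facebookresearch/semi-supervised-ImageNet1K-models',
                       'resnext50_32x4d_swsl', force_reload=False)
model.eval()
x = torch.randn(1, 3, 224, 224)

last_time = None
exec_times = []  # (module name, seconds) pairs in execution order

def store_time(name, module, input, output):
    # Forward hook with the module name bound in via functools.partial
    global last_time
    now = time.time()
    exec_times.append((name, now - last_time))
    last_time = now

# Hook only leaf modules so container times do not overlap with their children
for name, module in model.named_modules():
    if len(list(module.children())) == 0:
        module.register_forward_hook(partial(store_time, name))

with torch.no_grad():
    last_time = time.time()
    model(x)              # warm-up pass: absorbs one-time setup cost
    exec_times.clear()
    last_time = time.time()
    model(x)              # timed pass

for name, t in exec_times:
    print(f"{name}: {t:.6f} s")

The printed times are wall-clock and still coarse on a lightly loaded CPU; for finer-grained numbers, torch.autograd.profiler.profile (or torch.profiler in newer releases) reports per-operator CPU time without any manual hooks.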