Recently I used tf.profiler to calculate the FLOPs of ResNet-v1-50. I got 7084572224 (7.08 GFLOPs?), but the original paper reports 3.8 GFLOPs.
I did the same for VGG-19 and got 5628853928 (about 5.63 GFLOPs?), but its published value is 19.6 billion FLOPs. Note that all tested models come from tf.slim.
My code is as follows:
import tensorflow as tf
from tensorflow.contrib.framework import arg_scope
from nets import resnet_v1  # tf.slim model definitions (tensorflow/models/research/slim); adjust to your setup

run_meta = tf.RunMetadata()
im = tf.placeholder(tf.float32, [1, 224, 224, 3])  # a single 224x224 RGB image
with arg_scope(resnet_v1.resnet_arg_scope(use_batch_norm=True)):
    ims, endpoints = resnet_v1.resnet_v1_50(im)
print(get_num_of_params(tf.get_default_graph()))  # my own helper (definition omitted); see the profiler-based alternative after this snippet
opts = tf.profiler.ProfileOptionBuilder.float_operation()
flops = tf.profiler.profile(tf.get_default_graph(), run_meta=run_meta, cmd='op', options=opts)
print(flops.total_float_ops)
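For reference (this check is my addition, not part of the original snippet): the profiler can also report the trainable-parameter count that a helper like get_num_of_params presumably computes, using the same graph and run_meta:

# Optional sanity check: count trainable parameters with the profiler itself.
param_opts = tf.profiler.ProfileOptionBuilder.trainable_variables_parameter()
params = tf.profiler.profile(tf.get_default_graph(), run_meta=run_meta, cmd='op', options=param_opts)
print(params.total_parameters)  # prints the total number of trainable parameters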
Could someone please help me?
According to the last two lines of page 3 of the original paper (https://arxiv.org/pdf/1512.03385.pdf), their calculation only considers multiply and add operations (the paper counts multiply-adds), whereas TensorFlow additionally counts operations such as batch normalization, the max operations in pooling, and ReLU. I think that is the reason for the difference.
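To make the comparison more direct, one could try restricting the profile to the convolution and fully-connected kernels and halving the result to approximate the paper's multiply-add count. This is only a rough sketch based on my reading of the ProfileOptionBuilder API; the regexes and the exact filtering behaviour are assumptions I have not verified:

# Rough estimate: keep only Conv2D/MatMul FLOPs, then divide by 2 to get multiply-adds.
builder = tf.profiler.ProfileOptionBuilder
conv_opts = (builder(builder.float_operation())
             .with_node_names(show_name_regexes=['.*Conv2D.*', '.*MatMul.*'])
             .build())
conv_flops = tf.profiler.profile(tf.get_default_graph(), run_meta=run_meta, cmd='op', options=conv_opts)
# The paper counts one multiply-add as a single operation, so halve the profiler's count:
print(conv_flops.total_float_ops / 2)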