python  tensorflow  deep-learning  tensorflow2.0  tensorflow-hub

Running TF Hub models faster by transfer learning?


I have a project where I want to detect humans in images. I have implemented the code so that I use the pre-trained TF Hub models available at: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/tf2_detection_zoo.md. These models are already capable of recognizing humans, but they also recognize many other objects (they are trained on the COCO dataset). I am wondering: if I remove the head of these pre-trained models and train them to recognize only humans, would they run considerably faster than they do out of the box? Thanks for your help.


Solution

  • Transfer learning nearly always helps. In this case, the complexity of the learned features increases as you go deeper into the network's layers.

    Therefore, feel free to remove the last FC layer and replace it with your own. You can even remove the last several layers and add your own, which can give a nearly equivalent boost by helping your model converge faster. A minimal sketch of this head-swap is shown after this list.
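
Below is a minimal sketch of the head-swap idea using a TF Hub image feature extractor with Keras. Note that the detection-zoo models in the question are normally fine-tuned through the Object Detection API's pipeline config rather than plain Keras, so this only illustrates the general principle of freezing a pre-trained backbone and training a new head; the module URL and the train_ds/val_ds datasets are assumptions for illustration.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Example TF Hub image feature-vector module (any comparable module would do).
FEATURE_EXTRACTOR_URL = (
    "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/5"
)

# Load the pre-trained backbone with its original classification head removed,
# and freeze its weights so only the new head is trained.
backbone = hub.KerasLayer(
    FEATURE_EXTRACTOR_URL,
    input_shape=(224, 224, 3),
    trainable=False,
)

# Add a small custom head: a single sigmoid unit for "human / not human".
model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# train_ds / val_ds are assumed to be tf.data.Dataset objects of (image, label)
# pairs, with images resized to 224x224 and scaled to [0, 1].
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```

In this sketch only the new Dense head is trained while the frozen backbone weights stay fixed, which is what makes convergence fast with relatively little data.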