Tags: python, tensorflow, keras, google-aiy

How to remove layers from a Keras model in order to use it as a baseline for creating another model


I need to use a pre-trained model in Keras (keras.applications.VGG16) as a baseline for creating another model (for transfer learning) from its first layers. The end goal is to freeze and export the model for deployment on a Raspberry Pi with the AIY Vision Kit.

I've tried the common approach of:

inputs = keras.layers.Input(shape=(224, 224, 3))  # input tensor for the model
model_base = keras.applications.VGG16(input_tensor=inputs)
x = model_base.get_layer(backbone_layer).output  # backbone_layer = name of the last layer to keep
x = keras.layers.Flatten()(x)
x = keras.layers.Dense(1)(x)
model = keras.models.Model(inputs=inputs, outputs=x)
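(For comparison, instantiating VGG16 with include_top=False never builds the fully-connected head in the first place, so there is nothing to remove afterwards. A minimal sketch of that variant — weights=None is used here only to keep the example self-contained; in practice you would pass weights='imagenet':)

```python
from tensorflow import keras

# Build only the convolutional base: with include_top=False the
# fully-connected head (fc1, fc2, predictions) is never created.
inputs = keras.layers.Input(shape=(224, 224, 3))
model_base = keras.applications.VGG16(
    include_top=False, weights=None, input_tensor=inputs)

# Attach a small custom head for the new task.
x = keras.layers.Flatten()(model_base.output)
x = keras.layers.Dense(1)(x)
model = keras.models.Model(inputs=inputs, outputs=x)
```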

I also tried:

model_base._layers.pop() 

calling pop() n times, where n is the number of final layers I want to get rid of.

Both approaches seem to work: new_model.summary() shows only the desired first layers of the VGG16 model plus the new layers added for customization. However, when exporting the model and compiling it for TF-Lite, the compiler returns:

Not enough on-device memory to run model

This is odd, since the resulting model is even smaller (both the .pb file and the number of layers) than other manually defined models that can be imported correctly. After analyzing TensorBoard and exporting the .pb file as text, I found that the original model is being exported too (all the layers, even the ones not used and removed with pop()), not just the new one.

(As seen in TensorBoard, there are two parallel models: the desired one on the right, but the layers of the original model are still shown on the left; that part of the original model is also present in the exported .pb file.)


My question is: how can I definitively remove the unused layers from the keras.applications.VGG16 model and keep only the first layers plus the new custom layers? Using pop() has not worked, and I also tried del layer (in a for loop) without success.

Or, what other alternatives do I have for using a pre-trained model as a baseline, keeping only its first layers and then connecting them to other custom layers?


Solution

  • Saving the desired model with model.save(), clearing the TF session and then loading it again fixed the issue (as suggested by @NatthaphonHongcharoen in the comments):

    model.save(model_file)
    del model
    keras.backend.clear_session()
    model = keras.models.load_model(model_file)
    

    Now the exported graph in TensorBoard shows only the desired layers, and the frozen graph generates a smaller .pb file.

    However, another issue persists: even when using a single layer of the base model and adding one Dense layer, the compiler still says

    Not enough on-device memory to run model.

    But that is a different issue, not directly part of this question.
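For completeness, the export step after the save/reload fix can be sketched as follows. This is only an illustration assuming TF 2.x and its built-in TFLite converter (the AIY Vision Kit's own compiler consumes a frozen .pb instead, so adapt accordingly); the tiny Sequential model here is a stand-in for the real model so the sketch stays self-contained:

```python
import tensorflow as tf
from tensorflow import keras

# Stand-in for the real model built above, just to keep the sketch runnable.
model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])
model_file = "model.h5"
model.save(model_file)

# Reload the cleaned-up model and convert it to a TFLite flatbuffer.
model = keras.models.load_model(model_file)
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```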