python, tensorflow, keras, efficientnet

Extracting features from EfficientNet in TensorFlow


I have a CNN model trained using EfficientNetB6. My task is to extract the features of this trained model by removing the last dense layer and then use those features to train a boosting model. I did this in PyTorch earlier and was able to extract the outputs of the layers I was interested in, predict on my validation set, and then boost.

I am doing this now in TensorFlow but am currently stuck. Below is my model structure, and I have tried using the code from the website but did not have any luck.

[model summary screenshot]

I want to remove the last dense layer and predict on the validation set using the remaining layers.

I tried using:

    layer_name = 'efficientnet-b6'
    intermediate_layer_model = tf.keras.Model(inputs=model.input,
                                              outputs=model.get_layer(layer_name).output)

but I get the following error:

    ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(None, 760, 760, 3), dtype=float32) at layer "input_1". The following previous layers were accessed without issue: []

Any way to resolve this?


Solution

  • Sorry, my bad. I simply added a GlobalAveragePooling2D layer after the EfficientNet layer and I am able to extract the features and continue :)

    Just for reference (a short sketch of the follow-up feature-extraction step comes after the code):

    # EFNS (a list of EfficientNet constructors) and CFG (a config dict) are defined elsewhere.
    def build_model(dim=CFG['net_size'], ef=0):
        inp = tf.keras.layers.Input(shape=(dim, dim, 3))
        # EfficientNet backbone without its classification head
        base = EFNS[ef](input_shape=(dim, dim, 3), weights='imagenet', include_top=False)
        x = base(inp)
        # pooling layer added after the EfficientNet layer; its output is the feature vector
        x = tf.keras.layers.GlobalAveragePooling2D()(x)
        x = tf.keras.layers.Dense(1, activation='sigmoid')(x)
        model = tf.keras.Model(inputs=inp, outputs=x)
        opt = tf.keras.optimizers.Adam(learning_rate=0.001)
        loss = tf.keras.losses.BinaryCrossentropy(label_smoothing=0.05)
        model.compile(optimizer=opt, loss=loss, metrics=[tf.keras.metrics.AUC(name='auc')])
        model.summary()
        return model
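
    A minimal sketch of the follow-up step, assuming the trained model returned by build_model above and hypothetical arrays X_train, y_train, and X_val. The earlier get_layer('efficientnet-b6') attempt fails because that output tensor belongs to the nested sub-model's own graph; the GlobalAveragePooling2D layer, by contrast, sits in the outer model's graph, so its output can be wired to model.input without a graph-disconnected error:

    # build a feature extractor that stops at the pooling layer (second-to-last layer)
    feature_extractor = tf.keras.Model(
        inputs=model.input,
        outputs=model.layers[-2].output)  # output of the GlobalAveragePooling2D layer

    train_features = feature_extractor.predict(X_train)  # one pooled feature vector per image
    val_features = feature_extractor.predict(X_val)

    # the pooled features can then feed a boosting model, e.g. with scikit-learn:
    from sklearn.ensemble import GradientBoostingClassifier
    booster = GradientBoostingClassifier().fit(train_features, y_train)
    val_preds = booster.predict_proba(val_features)[:, 1]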