Tags: tensorflow, tensorflow-layers

Alternative to arg_scope when using tf.layers


I'm rewriting tf.contrib.slim.nets.inception_v3 using tf.layers. Unfortunately, the new tf.layers module does not work with arg_scope, since its functions lack the necessary decorators. Is there a better mechanism I should use to set default parameters for layers, or should I simply pass the proper arguments to each layer and remove the arg_scope?
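
For instance, pointing arg_scope at a tf.layers function fails as soon as the scope is entered (a minimal repro; I'm assuming tf.contrib.framework's arg_scope here):

import tensorflow as tf
from tensorflow.contrib.framework import arg_scope

# Raises a ValueError: tf.layers.conv2d is not decorated with @add_arg_scope.
with arg_scope([tf.layers.conv2d], padding='same'):
    pass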

Here is an example from the slim implementation that uses arg_scope:

from tensorflow.contrib import layers
from tensorflow.contrib.framework import arg_scope
from tensorflow.contrib.layers.python.layers import layers as layers_lib
from tensorflow.python.ops import variable_scope

with variable_scope.variable_scope(scope, 'InceptionV3', [inputs]):
    with arg_scope(
        [layers.conv2d, layers_lib.max_pool2d, layers_lib.avg_pool2d],
        stride=1,
        padding='VALID'):
        # ... layers defined here inherit stride=1 and padding='VALID' ...

Solution

  • There is no equivalent mechanism for defining default values in core TensorFlow, so you should specify the arguments for each layer explicitly.

    For instance, this code:

    with slim.arg_scope([slim.fully_connected], 
        activation_fn=tf.nn.relu, 
        weights_initializer=tf.truncated_normal_initializer(stddev=0.01),
        weights_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005)):
      x = slim.fully_connected(x, 800)
      x = slim.fully_connected(x, 1000)
    

    would become:

    x = tf.layers.dense(x, 800, activation=tf.nn.relu,
          kernel_initializer=tf.truncated_normal_initializer(stddev=0.01),
          kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
    x = tf.layers.dense(x, 1000, activation=tf.nn.relu,
          kernel_initializer=tf.truncated_normal_initializer(stddev=0.01),
          kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
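
    If repeating the same keyword arguments gets verbose, one plain-Python workaround (just functools.partial from the standard library; this isn't a tf.layers feature) is to bake the shared defaults into a local helper:

    import functools
    import tensorflow as tf

    # Hypothetical helper: tf.layers.dense with the shared defaults baked in.
    dense = functools.partial(
        tf.layers.dense,
        activation=tf.nn.relu,
        kernel_initializer=tf.truncated_normal_initializer(stddev=0.01),
        kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))

    x = dense(x, 800)
    x = dense(x, 1000)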
    

    Alternatively:

    with tf.variable_scope('fc', 
        initializer=tf.truncated_normal_initializer(stddev=0.01)):
      x = tf.layers.dense(x, 800, activation=tf.nn.relu,
          kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
      x = tf.layers.dense(x, 1000, activation=tf.nn.relu,
          kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=0.0005))
    

    Make sure to read each layer's documentation to see which initializers fall back to the variable scope's initializer. For example, when left unset, the dense layer's kernel_initializer picks up the variable scope initializer, while its bias_initializer defaults to tf.zeros_initializer().
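
    A quick way to check this behavior (a TF 1.x sketch; the variable lookup by name is my own scaffolding, not part of the answer):

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 4])
    with tf.variable_scope('fc', initializer=tf.constant_initializer(0.5)):
        y = tf.layers.dense(x, 2)  # kernel_initializer left unset

    kernel = [v for v in tf.trainable_variables() if 'kernel' in v.name][0]
    bias = [v for v in tf.trainable_variables() if 'bias' in v.name][0]
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        print(sess.run(kernel))  # all 0.5: picked up the scope initializer
        print(sess.run(bias))    # all 0.0: tf.zeros_initializer()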