About activation(bn)

Source: 5-11 Batch Normalization in Practice (2)

Casablanca_

2020-03-05

activation(bn)
Why can it be used directly like this? Where does this activation function come from? Doesn't it belong to tf?


1 Answer

正十七

2020-03-05

Take a look at the code:

def conv_wrapper(inputs, 
                 name, 
                 is_training,
                 output_channel = 32,
                 kernel_size = (3,3),
                 activation = tf.nn.relu,
                 padding = 'same'):
    """wrapper of tf.layers.conv2d"""
    # without bn: conv -> activation
    # with batch normalization: conv -> bn -> activation
    with tf.name_scope(name):
        conv2d = tf.layers.conv2d(inputs,
                                  output_channel,
                                  kernel_size,
                                  padding = padding,
                                  activation = None,
                                  name = name + '/conv2d')
        bn = tf.layers.batch_normalization(conv2d,
                                           training = is_training)
        return activation(bn)

The activation here is indeed a TensorFlow activation function; it is simply passed in from outside as a parameter. The default is tf.nn.relu.
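In Python, functions are first-class objects, so a function like tf.nn.relu can be bound to a parameter and later invoked by that parameter's name, which is exactly what `activation(bn)` does. A minimal plain-Python sketch of the same pattern (no TensorFlow; the names here are illustrative, not from the course code):

```python
def relu(x):
    """A toy scalar ReLU: returns x if positive, else 0."""
    return x if x > 0 else 0.0

def layer_demo(x, activation=relu):
    """Mimics how conv_wrapper receives an activation function as a parameter."""
    bn = x  # placeholder standing in for the batch-normalized conv output
    # 'activation' is just a callable passed in; this line invokes it
    return activation(bn)

print(layer_demo(-2.0))                      # uses the default (relu) -> 0.0
print(layer_demo(3.0, activation=lambda v: v * 2))  # a custom activation -> 6.0
```

Passing `activation=None` to tf.layers.conv2d in conv_wrapper disables the built-in activation, so that batch normalization can be inserted between the convolution and the externally supplied activation.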


Course: Deep Learning with Neural Networks (CNN/RNN/GAN): Algorithm Principles + Practice
