About activation(bn)
Source: 5-11 Batch Normalization in Practice (2)
Casablanca_
2020-03-05
activation(bn)
Why can it be used directly like this? Where does this activation function come from? Doesn't it belong to tf?
1 Answer
正十七
2020-03-05
Take a look at the code:
def conv_wrapper(inputs,
                 name,
                 is_training,
                 output_channel=32,
                 kernel_size=(3, 3),
                 activation=tf.nn.relu,
                 padding='same'):
    """Wrapper of tf.layers.conv2d."""
    # Without bn: conv -> activation
    # With batch normalization: conv -> bn -> activation
    with tf.name_scope(name):
        conv2d = tf.layers.conv2d(inputs,
                                  output_channel,
                                  kernel_size,
                                  padding=padding,
                                  activation=None,
                                  name=name + '/conv2d')
        bn = tf.layers.batch_normalization(conv2d,
                                           training=is_training)
        return activation(bn)
The activation here is indeed a TensorFlow activation function; it's just passed in from outside as a parameter. The default is tf.nn.relu.
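For illustration, a minimal sketch of how such calls might look (the input shape and layer names here are assumptions for the example, not from the lesson):

import tensorflow as tf

# Assumed input: a batch of 32x32 RGB images.
x = tf.placeholder(tf.float32, [None, 32, 32, 3])
is_training = tf.placeholder(tf.bool, [])

# No activation passed, so the default tf.nn.relu is used.
conv1 = conv_wrapper(x, 'conv1', is_training)

# Any callable that maps a tensor to a tensor works, e.g. tf.nn.tanh.
conv2 = conv_wrapper(conv1, 'conv2', is_training, activation=tf.nn.tanh)

Passing the activation as a parameter means conv_wrapper can build layers with any nonlinearity without changing the wrapper itself; note that the inner tf.layers.conv2d call sets activation=None so the nonlinearity is applied only after batch normalization.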
Similar questions
Why is conv + bn + relu effective?
1 answer
Why hasn't the code with the BN bug been updated yet?
2 answers