Initialization question, TF 1.14
Source: 2-6 Callbacks in practice
慕妹7897946
2020-01-08
Why did the model's accuracy drop after I replaced the code above with the code below? Very strange.
1 answer
正十七
2020-01-09
class Dense(tensorflow.python.keras.engine.base_layer.Layer)
 |  Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None, **kwargs)
 |
 |  Just your regular densely-connected NN layer.
 |
 |  `Dense` implements the operation:
 |  `output = activation(dot(input, kernel) + bias)`
 |  where `activation` is the element-wise activation function
 |  passed as the `activation` argument, `kernel` is a weights matrix
 |  created by the layer, and `bias` is a bias vector created by the layer
 |  (only applicable if `use_bias` is `True`).
 |
 |  Note: If the input to the layer has a rank greater than 2, then
 |  it is flattened prior to the initial dot product with `kernel`.
 |
 |  Example:
 |
 |  ```python
 |  # as first layer in a sequential model:
 |  model = Sequential()
 |  model.add(Dense(32, input_shape=(16,)))
 |  # now the model will take as input arrays of shape (*, 16)
 |  # and output arrays of shape (*, 32)
 |
 |  # after the first layer, you don't need to specify
 |  # the size of the input anymore:
 |  model.add(Dense(32))
 |  ```
 |
 |  Arguments:
 |    units: Positive integer, dimensionality of the output space.
 |    activation: Activation function to use.
 |      If you don't specify anything, no activation is applied
 |      (ie. "linear" activation: `a(x) = x`).
 |    use_bias: Boolean, whether the layer uses a bias vector.
 |    kernel_initializer: Initializer for the `kernel` weights matrix.
 |    bias_initializer: Initializer for the bias vector.
 |    kernel_regularizer: Regularizer function applied to
 |      the `kernel` weights matrix.
 |    bias_regularizer: Regularizer function applied to the bias vector.
 |    activity_regularizer: Regularizer function applied to
 |      the output of the layer (its "activation").
 |    kernel_constraint: Constraint function applied to
 |      the `kernel` weights matrix.
 |    bias_constraint: Constraint function applied to the bias vector.
Hi, the above is the constructor API of the Dense layer. As the signature shows, there is no `init` parameter; the initializer argument is named `kernel_initializer`. So if you pass `init=...`, your setting does not take effect and the layer ends up with the default `glorot_uniform` initializer, which is likely why the accuracy differs.
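To see why the choice of initializer can move accuracy at all, here is a minimal pure-Python sketch of the Glorot/Xavier uniform rule that `'glorot_uniform'` refers to (a sketch based on the standard formula `limit = sqrt(6 / (fan_in + fan_out))`; the `glorot_uniform` helper below is written for illustration and is not the Keras API):

```python
import math
import random

def glorot_uniform(fan_in, fan_out, seed=0):
    """Sketch of Glorot/Xavier uniform initialization.

    Weights are drawn from U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out)), so the variance of the
    activations is roughly preserved from layer to layer.
    """
    rng = random.Random(seed)
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [[rng.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]

# A (16, 32) kernel, matching Dense(32, input_shape=(16,)) above:
w = glorot_uniform(16, 32)
limit = math.sqrt(6.0 / (16 + 32))
assert all(-limit <= x <= limit for row in w for x in row)
```

In tf.keras the way to choose an initializer explicitly is the `kernel_initializer` argument, e.g. `Dense(32, kernel_initializer='glorot_uniform')`; the old Keras 1 spelling `init=...` is not part of this signature.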