tf.contrib.layers.layer_norm
layer_norm(
    inputs,
    center=True,
    scale=True,
    activation_fn=None,
    reuse=None,
    variables_collections=None,
    outputs_collections=None,
    trainable=True,
    scope=None
)
Defined in tensorflow/contrib/layers/python/layers/layers.py.
See the guide: Layers (contrib) > Higher level ops for building neural network layers
Adds a Layer Normalization layer from https://arxiv.org/abs/1607.06450.
"Layer Normalization"
Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton
Can be used as a normalizer function for conv2d and fully_connected.
Args:
- inputs: A tensor with 2 or more dimensions. The normalization occurs over all but the first dimension.
- center: If True, add an offset of beta to the normalized tensor. If False, beta is ignored.
- scale: If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (also e.g. nn.relu), this can be disabled since the scaling can be done by the next layer.
- activation_fn: Activation function; default is None, which skips it and maintains a linear activation.
- reuse: Whether or not the layer and its variables should be reused. To be able to reuse the layer, scope must be given.
- variables_collections: Optional collections for the variables.
- outputs_collections: Collections to add the outputs to.
- trainable: If True, also add variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
- scope: Optional scope for variable_scope.
Returns:
A Tensor representing the output of the operation.
Raises:
- ValueError: If the rank or the last dimension of inputs is undefined.
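The documented behavior (statistics computed over all but the first dimension, with an optional beta offset and gamma scale) can be illustrated with a minimal NumPy sketch. This is an assumption-laden reference implementation, not the contrib op itself: the function name, the epsilon value, and the beta/gamma handling here are illustrative choices, and the real op creates trainable beta/gamma variables rather than taking them as arguments.

```python
import numpy as np

def layer_norm_sketch(inputs, beta=None, gamma=None,
                      center=True, scale=True, eps=1e-12):
    """Sketch of layer normalization as described in the docs above:
    mean and variance are computed over all axes except the first
    (batch) axis. eps is a small constant assumed for numerical
    stability; the actual op's epsilon may differ."""
    x = np.asarray(inputs, dtype=np.float64)
    axes = tuple(range(1, x.ndim))          # all but the first dimension
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    out = (x - mean) / np.sqrt(var + eps)
    if scale and gamma is not None:         # scale=False -> gamma unused
        out = out * gamma
    if center and beta is not None:         # center=False -> beta ignored
        out = out + beta
    return out

# Each example in the batch is normalized independently:
x = np.random.RandomState(0).randn(4, 3, 5)
y = layer_norm_sketch(x)
print(np.allclose(y.mean(axis=(1, 2)), 0.0))
print(np.allclose(y.var(axis=(1, 2)), 1.0, atol=1e-6))
```

In the contrib op itself, beta and gamma are created internally as trainable variables, which is why center/scale toggle variable creation rather than taking explicit arrays.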
© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/contrib/layers/layer_norm