Norm and Regularization Layers

Batch-Normalization Layer

class returnn.tf.layers.basic.BatchNormLayer(**kwargs)[source]

Implements batch-normalization (http://arxiv.org/abs/1502.03167) as a separate layer.

All kwargs that the base class accepts are passed on to the base class; all remaining kwargs are forwarded to self.batch_norm().

layer_class = 'batch_norm'[source]
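
A minimal config sketch of how this layer might be used in a RETURNN network dict. The option names "momentum" and "epsilon" are assumptions about kwargs forwarded to self.batch_norm() and may differ across RETURNN versions:

    # Hypothetical RETURNN network config snippet.
    # "momentum" and "epsilon" are assumed batch_norm() kwargs.
    network = {
        "linear": {"class": "linear", "n_out": 512, "from": "data"},
        "bn": {"class": "batch_norm", "from": "linear",
               "momentum": 0.1, "epsilon": 1e-5},
        "output": {"class": "softmax", "loss": "ce", "from": "bn"},
    }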

Dropout Layer

class returnn.tf.layers.basic.DropoutLayer(extra_deps=(), **kwargs)[source]

Behaves the same as CopyLayer, which already supports dropout.

Parameters:
  • extra_deps (list[LayerBase]) – Add these as additional dependencies, without actually using them. This can still have an effect on the search beam, via SelectSearchSourcesLayer. We only have this option here on the CopyLayer because get_out_data_from_opts() must know about it in order to define the right beam. See also the option collocate_with, which differs in that it does not add a dependency.
layer_class = 'dropout'[source]
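
A minimal usage sketch as a network config dict; since this layer behaves like CopyLayer, the rate is given via the common layer option "dropout" (the 10% rate here is just an illustrative choice):

    # Hypothetical RETURNN network config snippet.
    network = {
        "drop": {"class": "dropout", "from": "data", "dropout": 0.1},
        "output": {"class": "softmax", "loss": "ce", "from": "drop"},
    }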

Layer-Normalization Layer

class returnn.tf.layers.basic.LayerNormLayer(epsilon=1e-06, **kwargs)[source]

Applies layer-normalization (https://arxiv.org/abs/1607.06450) over the feature dimension.

layer_class = 'layer_norm'[source]
classmethod get_out_data_from_opts(sources, name, **kwargs)[source]
Parameters:
  • sources (list[LayerBase]) –
  • name (str) –
Return type: Data
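
A minimal config sketch; the "epsilon" value below just restates the constructor default shown above:

    # Hypothetical RETURNN network config snippet.
    network = {
        "ln": {"class": "layer_norm", "from": "data", "epsilon": 1e-6},
        "output": {"class": "softmax", "loss": "ce", "from": "ln"},
    }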