Norm and Regularization Layers

Batch-Normalization Layer


Implements batch-normalization as a separate layer.

All kwargs accepted by the base class are passed on to it; all remaining kwargs are forwarded to self.batch_norm().

layer_class = 'batch_norm'[source]
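What such a layer computes during training can be sketched in plain NumPy (an illustrative sketch only, not RETURNN code; the function name, argument names, and the epsilon default are placeholders):

```python
import numpy as np

def batch_norm(x, gamma, beta, epsilon=1e-3):
    # Normalize over the batch axis (axis 0), then apply the
    # learned scale (gamma) and shift (beta) parameters.
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + epsilon)
    return gamma * x_hat + beta

rng = np.random.default_rng(42)
x = rng.normal(5.0, 3.0, size=(8, 4))  # (batch, feature)
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
# each feature now has mean ~0 and variance ~1 over the batch
```

At inference time a real batch-norm layer would instead use running statistics accumulated during training; this sketch only shows the training-time normalization.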

Dropout Layer

class DropoutLayer(**kwargs)[source]

Just the same as CopyLayer, because that one already supports dropout.

Parameters: extra_deps (list[LayerBase]) – Just add as an additional dependency, without really using it. This can have an effect though on the search beam, via SelectSearchSourcesLayer. We only have this here for the CopyLayer because the get_out_data_from_opts() must know about it and define the right beam. Also see the option collocate_with, which is different in that it does not add a dependency.
layer_class = 'dropout'[source]
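The dropout operation itself can be sketched in plain NumPy (an illustrative sketch, not RETURNN code; the function name and arguments are placeholders). This is the usual "inverted" variant, where the surviving units are rescaled at training time so that no rescaling is needed at inference:

```python
import numpy as np

def dropout(x, rate, rng):
    # Inverted dropout: zero out each unit with probability `rate`
    # and rescale the survivors by 1/(1-rate), so the expected
    # value of the output matches the input.
    keep = rng.random(x.shape) >= rate
    return np.where(keep, x / (1.0 - rate), 0.0)

rng = np.random.default_rng(0)
x = np.ones((1000, 10))
y = dropout(x, rate=0.3, rng=rng)
# y contains only 0.0 and 1/0.7, with mean close to 1.0
```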

Layer-Normalization Layer

class LayerNormLayer(**kwargs)[source]

Applies layer-normalization.

layer_class = 'layer_norm'[source]
classmethod get_out_data_from_opts(sources, name, **kwargs)[source]
Parameters:
  • sources (list[LayerBase]) –
  • name (str) –
Return type:
  Data
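Unlike batch normalization, layer normalization normalizes each sample over its feature axis, independently of the rest of the batch, so it behaves identically at training and inference time. A plain NumPy sketch (illustrative only, not RETURNN code; names and the epsilon default are placeholders):

```python
import numpy as np

def layer_norm(x, gamma, beta, epsilon=1e-6):
    # Normalize each sample over its feature axis (last axis),
    # independent of the other entries in the batch, then apply
    # the learned scale (gamma) and shift (beta).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + epsilon)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
x = rng.normal(2.0, 4.0, size=(5, 16))  # (batch, feature)
y = layer_norm(x, gamma=np.ones(16), beta=np.zeros(16))
# each row of y has mean ~0 and variance ~1 over its features
```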