Utility Layers

Framewise Statistics Layer

class TFNetworkLayer.FramewiseStatisticsLayer(sil_label_idx, histogram_num_bins=20, **kwargs)[source]

Collects various statistics (such as the frame error rate, FER) on the sources. The tensors are stored in self.stats, which is collected by TFEngine.

layer_class = 'framewise_statistics'[source]
classmethod get_out_data_from_opts(**kwargs)[source]
Return type: Data
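As a sketch (the layer name, source layer, and silence index below are illustrative, not from the source), such a layer would be added to a RETURNN network config dict like this:

```python
# Hypothetical RETURNN network config fragment.
# "framewise_statistics" collects statistics such as the FER on its sources.
network = {
    "stats": {
        "class": "framewise_statistics",
        "from": ["output"],        # assumed source layer to collect statistics on
        "sil_label_idx": 0,        # assumed index of the silence label
        "histogram_num_bins": 20,  # default number of histogram bins
    },
}
```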

Dropout Layer

class TFNetworkLayer.DropoutLayer(extra_deps=(), **kwargs)[source]

Just the same as CopyLayer, because that one already supports dropout.

Parameters:extra_deps (list[LayerBase]) – Just add as an additional dependency, without really using it. This can have an effect though on the search beam, via SelectSearchSourcesLayer. We only have this here for the CopyLayer because the get_out_data_from_opts() must know about it and define the right beam. Also see the option collocate_with, which is different in that it does not add a dependency.
layer_class = 'dropout'[source]
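A minimal config sketch (layer names are illustrative): since this layer is the same as CopyLayer, the actual dropout comes from the generic per-layer "dropout" option.

```python
# Hypothetical network config fragment: "dropout" behaves like "copy",
# which already handles the generic "dropout" option.
network = {
    "ff_do": {
        "class": "dropout",
        "from": ["ff"],  # assumed previous layer name
        "dropout": 0.3,  # assumed dropout rate, handled by the CopyLayer machinery
    },
}
```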

Image Summary Layer

class TFNetworkLayer.ImageSummaryLayer(max_outputs=3, **kwargs)[source]

Creates image summaries which can be viewed in TensorBoard. This layer expects the source to have shape (T-decoder, T-encoder, B, 1).

Parameters:max_outputs – number of images to generate per step
layer_class = 'image_summary'[source]
classmethod transform_config_dict(d, network, get_layer)[source]
Parameters:
  • d (dict[str]) – will modify in place
  • network (TFNetwork.TFNetwork) –
  • get_layer ((str) -> LayerBase) – function to get or construct another layer
classmethod get_out_data_from_opts(**kwargs)[source]
Return type: Data
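A typical use is visualizing attention weights. A hedged config sketch (the source layer name is an assumption):

```python
# Hypothetical config fragment: dump a tensor of shape (T-decoder, T-encoder, B, 1),
# e.g. attention weights, as TensorBoard image summaries.
network = {
    "att_image": {
        "class": "image_summary",
        "from": ["att_weights"],  # assumed layer producing (T-dec, T-enc, B, 1)
        "max_outputs": 3,         # images to generate per step (default)
    },
}
```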

Synthetic Gradient Layer

class TFNetworkLayer.SyntheticGradientLayer(gradient, meta_loss_scale=1.0, **kwargs)[source]

This is a generalized way to replace the true gradient with any kind of predicted gradient. This makes it possible to implement the idea from:

Decoupled Neural Interfaces using Synthetic Gradients, https://arxiv.org/abs/1608.05343
Parameters:
  • gradient (LayerBase) –
  • meta_loss_scale (float) –
layer_class = 'synthetic_gradient'[source]
classmethod transform_config_dict(d, network, get_layer)[source]
Parameters:
  • d (dict[str]) – will modify in place
  • network (TFNetwork.TFNetwork) –
  • get_layer ((str) -> LayerBase) – function to get or construct another layer
classmethod get_out_data_from_opts(sources, name, **kwargs)[source]
Parameters:
  • sources (list[LayerBase]) –
  • name (str) –
Return type: Data
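A hedged config sketch (all layer names are illustrative): the "gradient" option points at another layer that predicts the gradient.

```python
# Hypothetical config fragment: replace the true gradient of "ff" with a
# predicted one coming from the layer "grad_pred" (both names are made up).
network = {
    "ff_syn": {
        "class": "synthetic_gradient",
        "from": ["ff"],
        "gradient": "grad_pred",  # assumed layer predicting the gradient
        "meta_loss_scale": 1.0,   # default scale of the meta loss
    },
}
```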

HDFDumpLayer

class TFNetworkLayer.HDFDumpLayer(filename, extra=None, dump_whole_batches=False, labels=None, **kwargs)[source]

Dumps the sources into an HDF file, compatible with HDFDataset.

Common usage would be to add this to your network with “is_output_layer”: True, such that you don’t need to make other layers depend on it.

Parameters:
  • filename (str) –
  • extra (None|dict[str,LayerBase]) –
  • dump_whole_batches (bool) – dumps the whole batch as a single sequence into the HDF
  • labels (list[str]|None) –
layer_class = 'hdf_dump'[source]
classmethod get_out_data_from_opts(name, sources, **kwargs)[source]
Parameters:
  • name (str) –
  • sources (list[LayerBase]) –
Return type: Data

classmethod transform_config_dict(d, network, get_layer)[source]
Parameters:
  • d (dict[str]) – will modify in place
  • network (TFNetwork.TFNetwork) –
  • get_layer ((str) -> LayerBase) – function to get or construct another layer
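Following the common usage described above, a config sketch (source layer and filename are assumptions):

```python
# Hypothetical config fragment: dump a layer's output into an HDF file.
# "is_output_layer": True makes the layer run even though nothing depends on it.
network = {
    "dump": {
        "class": "hdf_dump",
        "from": ["encoder"],            # assumed source layer
        "filename": "encoder_out.hdf",  # illustrative output filename
        "is_output_layer": True,
    },
}
```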