Utility Layers

Framewise Statistics Layer

class returnn.tf.layers.basic.FramewiseStatisticsLayer(sil_label_idx, histogram_num_bins=20, **kwargs)[source]

Collects various framewise statistics (e.g. the frame error rate, FER) on the sources. The resulting tensors are stored in self.stats, which is collected by TFEngine.

layer_class = 'framewise_statistics'[source]
classmethod get_out_data_from_opts(**kwargs)[source]
Return type: Data
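
Example: a minimal sketch of how this layer might appear in a network config. The source layer name "output" and the silence label index 0 are placeholders, not prescribed values:

    network = {
        # ... other layers ...
        "framewise_stats": {
            "class": "framewise_statistics",
            "from": "output",          # hypothetical source layer
            "sil_label_idx": 0,        # placeholder: index of the silence label
            "histogram_num_bins": 20,  # default
        },
    }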

HDFDumpLayer

class returnn.tf.layers.basic.HDFDumpLayer(filename, extra=None, dump_whole_batches=False, labels=None, extend_existing_file=False, dump_per_run=False, **kwargs)[source]

Dumps into HDF file, compatible to HDFDataset.

The HDF file is written to disk under the specified filename, provided there was no error: by default at graph reset, via TFNetwork.register_graph_reset_callback(), or, with dump_per_run, after the dataset iteration run loop, via TFNetwork.register_run_finished_callback().

Common usage is to add this layer to your network with "is_output_layer": True, so that you do not need to make other layers depend on it (see the example at the end of this entry).

It currently uses SimpleHDFWriter internally.

Parameters:
  • filename (str|(()->str)) –
  • extra (None|dict[str,LayerBase]) –
  • dump_whole_batches (bool) – dumps the whole batch as a single sequence into the HDF
  • labels (list[str]|None) –
  • extend_existing_file (bool) – if True, we also expect that the file already exists
  • dump_per_run (bool) – write via TFNetwork.register_run_finished_callback()
layer_class = 'hdf_dump'[source]
classmethod get_out_data_from_opts(name, sources, **kwargs)[source]
Parameters:
  • name (str) –
  • sources (list[LayerBase]) –
Return type: Data

classmethod transform_config_dict(d, network, get_layer)[source]
Parameters:
  • d (dict[str]) – will modify inplace
  • network (returnn.tf.network.TFNetwork) –
  • get_layer ((str) -> LayerBase) – function to get or construct another layer
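
Example: as mentioned above, a typical use is to attach this layer as an output layer so that nothing else needs to depend on it. A minimal sketch, where "encoder" and the filename are placeholders:

    network = {
        # ... "encoder": {...}, ...
        "dump_encoder": {
            "class": "hdf_dump",
            "from": "encoder",              # hypothetical layer whose output we dump
            "filename": "encoder_out.hdf",  # placeholder path
            "is_output_layer": True,        # construct the layer without other dependents
        },
    }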

Image Summary Layer

class returnn.tf.layers.basic.ImageSummaryLayer(max_outputs=3, **kwargs)[source]

Creates image summaries which can be viewed in TensorBoard. This layer expects the source to have shape (T-decoder, T-encoder, B, 1).

Parameters:
  • max_outputs (int) – number of images to generate per step
layer_class = 'image_summary'[source]
classmethod transform_config_dict(d, network, get_layer)[source]
Parameters:
  • d (dict[str]) – will modify inplace
  • network (returnn.tf.network.TFNetwork) –
  • get_layer ((str) -> LayerBase) – function to get or construct another layer
classmethod get_out_data_from_opts(**kwargs)[source]
Return type: Data
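
Example: a minimal sketch for visualizing attention weights, assuming a hypothetical layer "att_weights" whose output already has shape (T-decoder, T-encoder, B, 1):

    network = {
        # ... "att_weights": {...}, ...
        "att_image": {
            "class": "image_summary",
            "from": "att_weights",    # hypothetical source, shape (T-dec, T-enc, B, 1)
            "max_outputs": 3,         # images per step (default)
            "is_output_layer": True,  # ensure the summary op gets constructed
        },
    }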

Scaled Gradient Layer

class returnn.tf.layers.basic.ScaledGradientLayer(scale, **kwargs)[source]

Acts as tf.identity in the forward pass, and scales the gradient by the given factor in backprop. With a negative factor it can be used as a gradient reversal layer. Uses TFUtil.scaled_gradient(), or tf.stop_gradient() if the scale is 0.

Parameters:
  • scale (float) – if 0., will use tf.stop_gradient
layer_class = 'scaled_grad'[source]
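
Example: a minimal sketch of gradient reversal for domain-adversarial training; "features" and "domain_classifier" are hypothetical layer names:

    network = {
        # ... "features": {...}, ...
        "grad_reverse": {
            "class": "scaled_grad",
            "from": "features",  # hypothetical shared feature layer
            "scale": -1.0,       # negative scale flips the gradient sign in backprop
        },
        # hypothetical adversarial branch trained on the reversed gradient:
        # "domain_classifier": {"class": "softmax", "from": "grad_reverse", ...},
    }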

Synthetic Gradient Layer

class returnn.tf.layers.basic.SyntheticGradientLayer(gradient, meta_loss_scale=1.0, **kwargs)[source]

This is a generalized way to replace the true gradient with any kind of predicted gradient. It enables implementing the idea from:

Decoupled Neural Interfaces using Synthetic Gradients, https://arxiv.org/abs/1608.05343
Parameters:
  • gradient (LayerBase) –
  • meta_loss_scale (float) –
layer_class = 'synthetic_gradient'[source]
classmethod transform_config_dict(d, network, get_layer)[source]
Parameters:
  • d (dict[str]) – will modify inplace
  • network (returnn.tf.network.TFNetwork) –
  • get_layer ((str) -> LayerBase) – function to get or construct another layer
classmethod get_out_data_from_opts(sources, name, **kwargs)[source]
Parameters:
  • sources (list[LayerBase]) –
  • name (str) –
Return type: Data
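
Example: a minimal sketch, assuming a hypothetical layer "grad_predictor" that is trained to predict the gradient of "hidden" (cf. the paper above):

    network = {
        # ... "hidden": {...}, "grad_predictor": {...}, ...
        "hidden_sg": {
            "class": "synthetic_gradient",
            "from": "hidden",              # hypothetical layer whose gradient is replaced
            "gradient": "grad_predictor",  # hypothetical layer predicting the gradient
            "meta_loss_scale": 1.0,        # default scale for the gradient-prediction loss
        },
    }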