returnn.frontend.parametrizations
Parametrizations using the parametrization API (register_parametrization()).
Also see:
https://github.com/rwth-i6/returnn/issues/1518
https://pytorch.org/tutorials/intermediate/parametrizations.html
- returnn.frontend.parametrizations.weight_dropout(module: Module, param_name: str, *, drop_prob: float) → Module [source]
Apply weight dropout to a parameter of a module.
This is only done in training.
It uses gradient_checkpoint_scope() to avoid any memory overhead.
In RETURNN TF-layers, this corresponds to the param_dropout option in a layer, or, for the RecLayer with unit="NativeLstm2", to the rec_weight_dropout option.
See the usage sketch after this entry.
- Parameters:
module – the module containing the parameter
param_name – name of the parameter
drop_prob – dropout probability
- Returns:
module
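For example, weight dropout could be applied to the weight matrix of a linear layer roughly as follows. This is a minimal sketch; it assumes that rf.Linear names its weight matrix parameter "weight", and the dims and drop_prob value are illustrative:

   import returnn.frontend as rf
   from returnn.frontend.parametrizations import weight_dropout
   from returnn.tensor import Dim

   in_dim = Dim(7, name="in")
   out_dim = Dim(11, name="out")
   linear = rf.Linear(in_dim, out_dim)
   # During training, entries of the weight matrix are randomly dropped
   # on each access; outside training, the parameter is used as-is.
   linear = weight_dropout(linear, "weight", drop_prob=0.1)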
- class returnn.frontend.parametrizations.WeightDropout(drop_prob: float)[source]
Use this for register_parametrization(), or via weight_dropout().
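The same effect as weight_dropout() can be achieved with the generic API directly. The following sketch assumes that register_parametrization() is exported at the top-level returnn.frontend namespace as rf.register_parametrization:

   import returnn.frontend as rf
   from returnn.frontend.parametrizations import WeightDropout
   from returnn.tensor import Dim

   linear = rf.Linear(Dim(7, name="in"), Dim(11, name="out"))
   # Equivalent to weight_dropout(linear, "weight", drop_prob=0.1).
   rf.register_parametrization(linear, "weight", WeightDropout(drop_prob=0.1))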
- returnn.frontend.parametrizations.weight_noise(module: Module, param_name: str, *, std: float) → Module [source]
Apply weight noise to a parameter of a module. This is also called variational noise.
This is only done in training.
It uses gradient_checkpoint_scope() to avoid any memory overhead.
In RETURNN TF-layers, this corresponds to the param_variational_noise option in a layer.
See the usage sketch after this entry.
- Parameters:
module – the module containing the parameter
param_name – name of the parameter
std – standard deviation of the noise
- Returns:
module
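Analogous to weight_dropout() above, a sketch of applying weight noise to a linear layer (same assumption that rf.Linear names its weight matrix parameter "weight"; the std value is just an illustrative choice):

   import returnn.frontend as rf
   from returnn.frontend.parametrizations import weight_noise
   from returnn.tensor import Dim

   linear = rf.Linear(Dim(7, name="in"), Dim(11, name="out"))
   # During training, Gaussian noise with the given std is added to the
   # weight on each access; evaluation uses the clean weight.
   linear = weight_noise(linear, "weight", std=0.075)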
- class returnn.frontend.parametrizations.WeightNoise(std: float)[source]
Use this for register_parametrization(), or via weight_noise().
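And likewise via the generic API (again assuming register_parametrization() is exported as rf.register_parametrization):

   import returnn.frontend as rf
   from returnn.frontend.parametrizations import WeightNoise
   from returnn.tensor import Dim

   linear = rf.Linear(Dim(7, name="in"), Dim(11, name="out"))
   # Equivalent to weight_noise(linear, "weight", std=0.075).
   rf.register_parametrization(linear, "weight", WeightNoise(std=0.075))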