# Pretrain¶

class Pretrain.Pretrain(original_network_json, network_init_args, copy_output_layer=None, greedy=None, repetitions=None, construction_algo=None)[source]

Starts with 1 hidden layer and goes up to N hidden layers -> N pretrain steps -> N epochs (with repetitions == 1). The first hidden layer is the input layer. This works for generic network constructions. See _construct_epoch().

Parameters:

- network_init_args (dict[str]) – additional args we use for LayerNetwork.from_json(); must have n_in, n_out.
- copy_output_layer (bool|str) – whether to copy the output layer params from the last epoch or reinit them.
- greedy (bool) – if True, only train the output + last layer, otherwise train all layers.
- repetitions (None|int|list[int]|dict) – how often to repeat certain pretrain steps. The default is one epoch per step. It can also be a dict, with keys like 'default' and 'final'. See code below.
- construction_algo (str) – e.g. "from_output"
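To make the `repetitions` parameter concrete, here is an illustrative sketch (not the module's actual code) of how the documented forms – `None`, `int`, `list[int]`, and a dict with `'default'`/`'final'` keys – could expand N pretrain steps into per-step epoch counts. The function name and padding behavior are assumptions for illustration.

```python
def expand_repetitions(num_steps, repetitions=None):
    """Illustrative sketch: epochs per pretrain step for each documented form."""
    if repetitions is None:
        # default: one epoch per pretrain step
        return [1] * num_steps
    if isinstance(repetitions, int):
        # same repetition count for every step
        return [repetitions] * num_steps
    if isinstance(repetitions, dict):
        # keys like 'default' (all steps) and 'final' (last step only)
        reps = [repetitions.get("default", 1)] * num_steps
        if "final" in repetitions:
            reps[-1] = repetitions["final"]
        return reps
    # list[int]: explicit per-step counts, padded with the last value (assumption)
    reps = list(repetitions)
    while len(reps) < num_steps:
        reps.append(reps[-1])
    return reps[:num_steps]
```

For example, `expand_repetitions(3, {"default": 2, "final": 5})` would train the first two pretrain steps for 2 epochs each and the final step for 5.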
copy_params_from_old_network(new_network, old_network)[source]

Returns: the remaining hidden layer names which exist only in the new network.
Return type: set[str]
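A minimal sketch of the documented behavior, under the assumption that a network can be modeled as a plain dict mapping layer names to parameters (the real LayerNetwork API differs): parameters are copied for layers present in both networks, and the names of layers found only in the new network are returned.

```python
def copy_params_from_old_network(new_network, old_network):
    """Sketch only: networks modeled as {layer_name: params} dicts."""
    remaining = set()
    for name in new_network:
        if name in old_network:
            # layer exists in both: carry the trained params over
            new_network[name] = old_network[name]
        else:
            # layer exists only in the new network
            remaining.add(name)
    return remaining
```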

get_final_network_json()[source]
get_network_for_epoch(epoch, mask=None)[source]
Return type: Network.LayerNetwork
get_network_json_for_epoch(epoch)[source]
Parameters: epoch (int) – starting at 1
Return type: dict[str]
get_train_num_epochs()[source]
get_train_param_args_for_epoch(epoch)[source]

Returns: the kwargs for LayerNetwork.set_train_params, i.e. which params to train.
Return type: dict[str]

class Pretrain.WrapEpochValue(func)[source]

Use this wrapper if you want to define some value in your network which depends on the pretrain epoch. It will be part of your network description dict.

get_value(epoch)[source]
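The mechanism above can be sketched as follows. This is an illustrative reimplementation, not the module's actual code: a WrapEpochValue holds a function of the pretrain epoch, and a hypothetical helper (`resolve_wrap_values`, named here for illustration) recursively replaces such wrappers in a network description dict with their value for a given epoch.

```python
class WrapEpochValue:
    """Sketch: wraps a function of the pretrain epoch."""

    def __init__(self, func):
        self.func = func

    def get_value(self, epoch):
        return self.func(epoch)


def resolve_wrap_values(net_json, epoch):
    """Hypothetical helper: replace WrapEpochValue instances for `epoch`."""
    if isinstance(net_json, WrapEpochValue):
        return net_json.get_value(epoch)
    if isinstance(net_json, dict):
        return {k: resolve_wrap_values(v, epoch) for k, v in net_json.items()}
    if isinstance(net_json, list):
        return [resolve_wrap_values(v, epoch) for v in net_json]
    return net_json


# Usage: dropout grows with the pretrain epoch, capped at 0.5.
net = {"hidden": {"class": "forward",
                  "dropout": WrapEpochValue(lambda epoch: min(0.1 * epoch, 0.5))}}
```

With this sketch, resolving `net` for epoch 10 yields a plain dict whose dropout is 0.5.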
Pretrain.demo()[source]
Pretrain.find_pretrain_wrap_values(net_json)[source]
Pretrain.pretrainFromConfig(config)[source]
Return type: Pretrain | None