Network

class Network.LayerNetwork(n_in=None, n_out=None, base_network=None, data_map=None, data_map_i=None, shared_params_network=None, mask=None, sparse_input=False, target='classes', train_flag=False, eval_flag=False)[source]
Parameters:
  • n_in (int) – input dim of the network
  • n_out (dict[str,(int,int)]) – output dim of the network. The first int is the number of classes; the second int is 1 if the data is sparse, i.e. we will get the indices.
  • data_map (dict[str,theano.Variable]) – if specified, this will be used for x/y (and it expects data_map_i)
  • data_map_i (dict[str,theano.Variable]) – if specified, this will be used for i/j
  • base_network (LayerNetwork|None) – optional base network from which we derive x/y/i/j/n_in/n_out. data_map takes precedence over base_network.
  • shared_params_network (LayerNetwork|()->LayerNetwork|None) – optional network to share params with. We will raise an error if there is a param which cannot be shared.
  • mask (str) – e.g. “unity” or None (“dropout”)
  • sparse_input (bool) – for SourceLayer
  • target (str) – default target
  • train_flag (bool) – marks that we are used for training
  • eval_flag (bool) – marks that we are used for evaluation
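
For illustration, an n_out dict in this format might look as follows (the keys and dimensions are made-up example values, not defaults):

```python
# n_out maps each data key to (dim, sparse_flag): the first int is the
# output dim / number of classes, the second is 1 if the data is sparse
# (i.e. given as indices), else 0. Keys and sizes here are illustrative.
n_out = {
    "classes": (5000, 1),  # sparse: targets come as class indices
    "data": (40, 0),       # dense: e.g. 40-dim feature frames
}

for key, (dim, sparse) in sorted(n_out.items()):
    print(key, dim, "sparse" if sparse else "dense")
```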
add_cost_and_constraints(layer)[source]
add_layer(layer)[source]

Return type: NetworkHiddenLayer.Layer

declare_train_params(**kwargs)[source]

Kwargs as in self.get_params(), or default values.

classmethod epoch_from_hdf_model(model)[source]

Returns: last epoch the model was trained on
Return type: int

classmethod epoch_from_hdf_model_filename(model_filename)[source]

Returns: last epoch the model was trained on
Return type: int

classmethod from_base_network(base_network, json_content=None, share_params=False, base_as_calc_step=False, **kwargs)[source]
Parameters:
  • base_network (LayerNetwork) – base network to derive from
  • json_content (dict[str]|None) – JSON content for subnetwork. if None, will use from base network
  • share_params (bool) – will use the same params as the base network
  • base_as_calc_step (bool) – base is calc step 0. see below
  • kwargs (dict[str]) – kwargs for __init__
Return type:

LayerNetwork

classmethod from_config_topology(config, mask=None, **kwargs)[source]
Parameters:mask (str) – e.g. “unity” or None (“dropout”). “unity” is for testing.
Return type:LayerNetwork
classmethod from_description(description, mask=None, **kwargs)[source]
Parameters:mask (str) – e.g. “unity” or None (“dropout”)
Return type:LayerNetwork
classmethod from_hdf(filename=None, model=None, load_params=True, **kwargs)[source]

Gets the JSON from the hdf file, initializes the network and loads the network params.

Parameters:
  • filename (str|None) – filename of hdf
  • model (h5py.File|None) – hdf, if no filename is provided
  • load_params (bool) – whether to load the params

classmethod from_hdf_model_topology(model, **kwargs)[source]
Return type:LayerNetwork
classmethod from_json(json_content, n_in=None, n_out=None, network=None, **kwargs)[source]
Parameters:network (LayerNetwork|None) – optional already existing instance
Return type:LayerNetwork
classmethod from_json_and_config(json_content, config, **kwargs)[source]
Return type:LayerNetwork
get_all_layers()[source]
get_all_params_vars()[source]
get_calc_step(i)[source]
Parameters:i (int) – calc step, 0 to n
Return type:LayerNetwork

Used by CalcStepLayer. Will automatically create the requested calc step. Calc step 0 is the base network (calc_step_base).

get_layer(layer_name)[source]
get_layer_param(layer_name, param_name, param)[source]

Used by Container.add_param() to maybe substitute a parameter instead of creating a new shared var. If we return None, Container.add_param() will continue as usual.

Parameters:
  • layer_name (str) – the layer name where this param will be added
  • param_name (str) – the name of the param
  • param (theano.SharedVariable) – the already created shared var
Return type: None | theano.Variable

get_objective()[source]
get_params_dict()[source]
Return type:dict[str,dict[str,numpy.ndarray|theano.sandbox.cuda.CudaNdArray]]
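
The returned structure is a nested dict, layer name → param name → array. A hedged sketch of its shape (the layer names, param names, and sizes below are assumptions for illustration, not actual network contents):

```python
import numpy as np

# Hypothetical structure of a params dict as returned by get_params_dict():
# dict[layer_name][param_name] -> numpy array. Names/shapes are made up.
params = {
    "hidden_0": {
        "W_in": np.zeros((40, 100), dtype="float32"),
        "b": np.zeros((100,), dtype="float32"),
    },
    "output": {
        "W_in": np.zeros((100, 5000), dtype="float32"),
    },
}

# Total number of scalar parameters across all layers.
n_params = sum(v.size for layer in params.values() for v in layer.values())
```

The same nested shape is what set_params_by_dict() expects back.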
get_params_shared_flat_dict()[source]
Return type:dict[str,theano.shared]

This will collect all vars of all layers in one dict. We extend the param name with our custom scheme.
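
The exact naming scheme is internal to LayerNetwork; as a hedged sketch, flattening per-layer param dicts into one dict could look like this (the "layer.param" separator is an assumption, not the actual scheme):

```python
def flatten_params(params_by_layer):
    """Flatten dict[layer][param] -> one dict keyed by a combined name.
    The "<layer>.<param>" separator used here is illustrative only; the
    real naming scheme is internal to LayerNetwork."""
    flat = {}
    for layer_name, layer_params in params_by_layer.items():
        for param_name, var in layer_params.items():
            flat["%s.%s" % (layer_name, param_name)] = var
    return flat

flat = flatten_params({"hidden_0": {"W_in": 1, "b": 2}, "output": {"W_in": 3}})
```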

get_params_vars(hidden_layer_selection, with_output)[source]
Return type:list[theano.compile.sharedvalue.SharedVariable]

Returns: list (with well-defined order) of shared variables

get_train_param_args_default()[source]

Returns: default kwargs for self.get_params(), with which it returns all params.

get_used_data_keys()[source]
init_args()[source]
classmethod init_args_from_config(config)[source]
Return type:dict[str]

Returns: the kwargs for cls.from_json()

classmethod json_from_config(config, mask=None)[source]
Parameters:mask (str) – “unity”, “none” or “dropout”
Return type:dict[str]
load_hdf(model)[source]

Returns: last epoch this was trained on
Return type: int

make_classifier(name='output', target='classes', **kwargs)[source]
new_subnetwork(json_content, n_out, data_map, data_map_i)[source]
Parameters:
  • json_content (dict[str,dict]) – subnetwork specification
  • n_out (dict[str,list[int,int]]) – n_out info for subnetwork
  • data_map (dict[str,theano.Variable]) – data
  • data_map_i (dict[str,theano.Variable]) – indices for data
Return type:

LayerNetwork

The data input for the subnetwork is not derived from ourselves but specified explicitly through n_out & data_map.
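
Note that n_out here uses [dim, sparse_flag] lists rather than the constructor's tuples. A made-up specification (key names and sizes are illustrative):

```python
# n_out for new_subnetwork(): dict[str, list[int, int]], where each value
# is [dim, sparse_flag]. Keys and sizes below are example values only.
sub_n_out = {
    "data": [40, 0],       # dense 40-dim input
    "classes": [5000, 1],  # sparse class indices
}
```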

num_params()[source]
print_network_info(name='Network')[source]
save_hdf(model, epoch)[source]
set_cost_constraints_and_objective()[source]
set_params_by_dict(params)[source]
to_json()[source]
to_json_content()[source]
use_target(target, dtype)[source]