HyperParamTuning

Here we provide some logic to perform hyper-parameter search. See demos/demo-hyper-param-tuning.config for an example config. For each config entry that should be searched over, you declare it as an instance of HyperParam. This module will then find all such instances in the config and replace them with concrete values during the search.
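As a rough sketch of what such a config could look like: the HyperParam signature below matches the documented one, but the config keys (learning_rate etc.) are purely illustrative, and a minimal stand-in class is defined here only to make the snippet self-contained.

```python
# Minimal stand-in for HyperParamTuning.HyperParam, only so this sketch runs
# on its own; in a real config you would import the actual class.
class HyperParam:
    def __init__(self, dtype=None, bounds=None, classes=None, log=False, default=None):
        self.dtype = dtype
        self.bounds = bounds    # inclusive [lower, upper]
        self.classes = classes  # explicit list of candidate values
        self.log = log          # whether to sample in log-scale
        self.default = default

# Hypothetical config entries; the key names are examples, not prescribed.
config = {
    "learning_rate": HyperParam(dtype=float, bounds=[1e-5, 1e-2], log=True, default=1e-3),
    "num_layers": HyperParam(dtype=int, bounds=[1, 6], default=3),
    "optimizer": HyperParam(classes=["sgd", "adam"], default="adam"),
}

# The tuning module walks the config, collects all HyperParam instances,
# and substitutes concrete values for each search candidate.
hyper_params = {k: v for k, v in config.items() if isinstance(v, HyperParam)}
```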

The search itself is an evolutionary (genetic) search. There are many possible variants, e.g. in which manipulations are applied (such as cross-over and mutation), and in how new random values are sampled. The current logic can probably be improved.
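The general shape of such a genetic search, independent of this module's exact implementation, can be sketched as follows (a minimal select-best-half / refill-via-cross-over-and-mutation loop on a toy problem; the concrete operators and selection scheme here are assumptions, not the module's actual logic):

```python
import random

def evolutionary_search(sample, mutate, cross_over, score, pop_size=8, iterations=20, seed=1):
    """Minimal genetic-search sketch: keep the best half of the population,
    refill the rest via cross-over of two parents plus mutation."""
    rnd = random.Random(seed)
    population = [sample(rnd) for _ in range(pop_size)]
    for _ in range(iterations):
        population.sort(key=score)  # lower score is better
        parents = population[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rnd.sample(parents, 2)
            children.append(mutate(cross_over(a, b, rnd), rnd))
        population = parents + children
    return min(population, key=score)

# Toy problem: find x close to 3.
best = evolutionary_search(
    sample=lambda rnd: rnd.uniform(-10, 10),
    mutate=lambda x, rnd: x + rnd.gauss(0, 0.5),
    cross_over=lambda a, b, rnd: (a + b) / 2,
    score=lambda x: (x - 3) ** 2,
)
```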

Currently, each evaluation is a training run started from scratch, and the accumulated training score is used as the evaluation measure. This can probably also be improved. Instead of always starting from scratch, we could keep intermediate results and resume from them, or reuse intermediate results from the real training and resume from those. We could even do some simple search at the beginning of each epoch, as long as we keep it cheap enough.
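One cheap version of the "keep intermediate results" idea is to memoize the expensive train-and-score evaluation per hyper-param setting, so that settings which survive across generations are not retrained. This is a sketch of that idea only, not the module's behavior; the evaluate body is a trivial stand-in for a full training run:

```python
import functools

@functools.lru_cache(maxsize=None)
def evaluate(settings):
    """settings: hashable tuple of (name, value) pairs.
    Stand-in for a full training run returning the accumulated train score."""
    return sum(v * v for _, v in settings)

score = evaluate((("learning_rate", 0.001), ("num_layers", 3)))
again = evaluate((("learning_rate", 0.001), ("num_layers", 3)))  # served from cache
```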

Also, we could store the population of hyper-params on disk to allow resuming a search.

class HyperParamTuning.HyperParam(dtype=None, bounds=None, classes=None, log=False, default=None)[source]
Parameters:
  • dtype (str|type|None|list) – e.g. “float”, “int” or “bool”; if a collection is given, it will be treated as classes
  • bounds (None|list[int|float]) – inclusive
  • classes (list|None) –
  • log (bool) – if in log-scale
  • default (float|int|object|None) –
get_canonical_usage()[source]
get_sorted_usages()[source]
description()[source]
get_num_instances(upper_limit=100)[source]
Parameters:upper_limit (int) –
Return type:int
merge_values(value1, value2)[source]

Merge two values, which are valid values for this HyperParam.

Parameters:
  • value1 (T) –
  • value2 (T) –
Return type:

T
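How two values get merged is not specified here, so as an illustration only, here is one plausible rule per dtype (midpoint for numeric values, random pick otherwise); the real HyperParam.merge_values may well differ:

```python
import random

def merge_values(value1, value2, dtype=float, classes=None, rnd=None):
    """Hypothetical sketch of merging two valid HyperParam values."""
    rnd = rnd or random.Random(0)
    if classes is not None or dtype is bool:
        return rnd.choice([value1, value2])  # no sensible midpoint: pick one
    merged = (value1 + value2) / 2.0  # numeric: take the midpoint
    return int(round(merged)) if dtype is int else merged
```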

get_value(selected, eps=1e-16)[source]
Parameters:
  • selected (float) – must be between 0 and 1
  • eps (float) – if in log-space and you have e.g. bounds=[0,1], this will be the lowest representable value just above 0; see the code
Return type:

float|int|bool|object
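A sketch of the kind of mapping this describes: selected in [0, 1] is mapped onto the inclusive bounds, linearly or in log-scale, with a zero lower bound clipped to eps before taking the log (as the eps docstring suggests). The actual implementation may differ in detail, e.g. in how int/bool/class values are derived:

```python
import math

def get_value(selected, bounds, log=False, eps=1e-16):
    """Map selected in [0, 1] onto inclusive bounds, optionally in log-scale."""
    assert 0.0 <= selected <= 1.0
    lo, hi = bounds
    if log:
        lo = max(lo, eps)  # avoid log(0) when the lower bound is 0
        return math.exp(math.log(lo) + selected * (math.log(hi) - math.log(lo)))
    return lo + selected * (hi - lo)
```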

get_initial_value()[source]
get_default_value()[source]
get_random_value(seed, eps=1e-16)[source]
Parameters:
  • seed (int) –
  • eps (float) – see get_value()
Return type:

float|int|bool|object

get_random_value_by_idx(iteration_idx, individual_idx)[source]
Parameters:
  • iteration_idx (int) –
  • individual_idx (int) –
Return type:

float|int|bool|object

exception HyperParamTuning.TrainException[source]
class HyperParamTuning.Individual(hyper_param_mapping, name)[source]
Parameters:
  • hyper_param_mapping (dict[HyperParam]) –
  • name (str) –
cross_over(hyper_params, population, random_seed)[source]
Parameters:
Returns:

copy of self, crossed over with others

Return type:

Individual
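As a rough illustration of what "a copy of self, crossed over with others" could mean at the level of the hyper-param mapping (the cross-over probability and the merge rule here are assumptions; the real method delegates to the HyperParam objects and may differ):

```python
import random

def cross_over(self_mapping, population_mappings, random_seed=0):
    """Hypothetical sketch of Individual.cross_over: for each hyper param,
    sometimes replace the value by a merge with a random other individual."""
    rnd = random.Random(random_seed)
    result = dict(self_mapping)  # copy of self
    for key, value in result.items():
        other = rnd.choice(population_mappings)
        if rnd.random() < 0.5:  # cross-over probability (assumed)
            result[key] = (value + other[key]) / 2.0  # merge rule, cf. merge_values
    return result
```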

class HyperParamTuning.Optimization(config, train_data)[source]
Parameters:
get_population(iteration_idx, num_individuals)[source]
Parameters:
  • iteration_idx (int) –
  • num_individuals (int) –
Return type:

list[Individual]

get_individual(iteration_idx, individual_idx)[source]
Parameters:
  • iteration_idx (int) –
  • individual_idx (int) –
Return type:

Individual

cross_over(population, iteration_idx)[source]
Parameters:
  • population (list[Individual]) – modified in-place
  • iteration_idx (int) –
create_config_instance(hyper_param_mapping, gpu_ids)[source]
Parameters:
  • hyper_param_mapping (dict[HyperParam]) – maps each hyper param to some value
  • gpu_ids (set[int]) –
Return type:

Config

work()[source]
HyperParamTuning.hash_str_djb2(s)[source]
Parameters:s (str) –
Return type:int
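djb2 is Dan Bernstein's classic string hash (h = h * 33 + byte, starting from 5381). A sketch of the well-known algorithm follows; the module's version may differ, e.g. in how it truncates the result:

```python
def hash_str_djb2(s):
    """Classic djb2 string hash, kept to 32 bits."""
    h = 5381
    for c in s:
        h = (h * 33 + ord(c)) & 0xFFFFFFFF
    return h
```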
HyperParamTuning.hash_seq(ls)[source]
Parameters:ls (list|tuple) –
Return type:int
HyperParamTuning.hash_int(x)[source]
Parameters:x (int) –
Return type:int
HyperParamTuning.hash_obj(x)[source]
Parameters:x (tuple|list|str|_AttribOrKey|_AttrChain) –
Return type:int