optuna

Utilities for optimizing param parameters via Optuna

See also

Hyperparameter Optimization with Optuna

A tutorial on how to use this module

class pydrobert.param.optuna.TunableParameterized(*, name)[source]

An interface for Optuna to tune Parameterized instances

The TunableParameterized interface requires two class methods: get_tunable() and suggest_params().

Any object with both is a TunableParameterized. Just like in collections.abc, a class need not directly subclass TunableParameterized for isinstance() and issubclass() to return True. Subclassing TunableParameterized directly ensures the class also inherits from param.parameterized.Parameterized
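The collections.abc-style structural check can be sketched with a stdlib stand-in; the TunableLike and MyParams names below are hypothetical illustrations, not the library's actual implementation:

```python
import abc


class TunableLike(abc.ABC):
    """Hypothetical stand-in illustrating TunableParameterized's duck typing."""

    @classmethod
    def __subclasshook__(cls, C):
        # Treat any class defining both classmethods as a virtual subclass,
        # mirroring the collections.abc-style check described above
        if cls is TunableLike:
            if all(hasattr(C, name) for name in ('get_tunable', 'suggest_params')):
                return True
        return NotImplemented


class MyParams:  # note: no explicit inheritance from TunableLike
    @classmethod
    def get_tunable(cls):
        return {'lr'}

    @classmethod
    def suggest_params(cls, trial, base=None, only=None, prefix=''):
        return base


assert issubclass(MyParams, TunableLike)
assert isinstance(MyParams(), TunableLike)
```

Because the check is structural, third-party classes can participate without importing anything from this module.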

abstract classmethod get_tunable()[source]

Get a set of names of tunable parameters

The values are intended to be names of parameters. Values should not contain '.'.

abstract classmethod suggest_params(trial, base=None, only=None, prefix='')[source]

Populate an instance of this class with parameters based on trial

Parameters:
  • trial (optuna.trial.Trial) – The current optuna trial. Parameter values will be sampled from this

  • base (Optional[TunableParameterized]) – If set, parameter values will be loaded into this instance. If None, a new instance will be created matching this class type

  • only (Optional[Collection[str]]) – Only sample parameters with names in this set. If None, all the parameters from get_tunable() will be sampled

  • prefix (str) – A value to be prepended to the names from only when sampling those parameters from trial

Returns:

TunableParameterized – Either base if not None, or a new instance of this class with parameters matching sampled values
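A typical suggest_params() implementation follows the pattern below. This is a sketch: the real trial argument would be an optuna.trial.Trial, and the DummyTrial, ModelParams class, and its bounds here are hypothetical stand-ins:

```python
class DummyTrial:
    """Hypothetical stand-in for optuna.trial.Trial; always samples the lower bound."""

    def suggest_float(self, name, low, high):
        return low

    def suggest_int(self, name, low, high):
        return low


class ModelParams:
    def __init__(self):
        self.lr = 1e-4
        self.num_layers = 3

    @classmethod
    def get_tunable(cls):
        return {'lr', 'num_layers'}

    @classmethod
    def suggest_params(cls, trial, base=None, only=None, prefix=''):
        # Load into base if provided, otherwise create a fresh instance
        params = cls() if base is None else base
        if only is None:
            only = cls.get_tunable()
        # Sample each requested parameter, prepending prefix to the trial name
        if 'lr' in only:
            params.lr = trial.suggest_float(prefix + 'lr', 1e-8, 1e-1)
        if 'num_layers' in only:
            params.num_layers = trial.suggest_int(prefix + 'num_layers', 1, 10)
        return params


params = ModelParams.suggest_params(DummyTrial(), only={'lr'}, prefix='model.')
```

Here only={'lr'} means num_layers keeps its default; the prefix keeps trial names unambiguous when several TunableParameterized instances share one trial.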

pydrobert.param.optuna.get_param_dict_tunable(param_dict, on_decimal='warn')[source]

Return a set of all the tunable parameters in a parameter dictionary

This function crawls through a (possibly nested) dictionary of objects, looks for any that implement the TunableParameterized interface, collects the results of calls to get_tunable(), and returns the set tunable.

Elements of tunable are strings with the format "<key_0>.<key_1>.<...>.<parameter_name>", where parameter_name is a parameter from param_dict[<key_0>][<key_1>][...].get_tunable()

Parameters:
  • param_dict (dict) –

  • on_decimal (Literal['ignore', 'warn', 'raise']) – '.' can produce ambiguous parameters in tunable. When one is found as a key in param_dict or as a tunable parameter: “raise” means a ValueError will be raised; “warn” means a warning will be issued via warnings; and “ignore” just ignores it

Returns:

tunable (collections.OrderedDict)
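The crawl-and-prefix behaviour can be sketched with a self-contained stand-in; crawl_tunable and ModelParams below are hypothetical, not the library's implementation:

```python
def crawl_tunable(param_dict, prefix=''):
    """Collect '<key_0>.<...>.<parameter_name>' strings from a nested dict (sketch)."""
    tunable = set()
    for key, value in param_dict.items():
        name = prefix + key
        if isinstance(value, dict):
            # Recurse into nested dictionaries, extending the dotted prefix
            tunable |= crawl_tunable(value, name + '.')
        elif hasattr(value, 'get_tunable') and hasattr(value, 'suggest_params'):
            # Duck-typed TunableParameterized: record its tunable parameters
            tunable |= {name + '.' + p for p in value.get_tunable()}
    return tunable


class ModelParams:
    @classmethod
    def get_tunable(cls):
        return {'lr', 'num_layers'}

    @classmethod
    def suggest_params(cls, trial, base=None, only=None, prefix=''):
        pass


param_dict = {'training': {'model': ModelParams()}}
names = crawl_tunable(param_dict)
```

This is why '.' inside keys or parameter names is flagged via on_decimal: a literal dot would make the dotted path ambiguous.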

pydrobert.param.optuna.parameterized_class_from_tunable(tunable, base=<class 'param.parameterized.Parameterized'>, default=[])[source]

Construct a Parameterized class to store parameters to optimize

This function creates a subclass of param.parameterized.Parameterized that has only one parameter: only. only is a param.ListSelector that allows values from tunable

Parameters:
  • tunable – The collection of names that only may take as values, e.g. the set returned by get_param_dict_tunable()

  • base (type) – The class from which the returned class is derived

  • default (list) – The default value of the only parameter

Returns:

Derived (base) – The derived class
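The dynamic class construction can be sketched with plain type(), using an __init__-time membership check as a stand-in for param.ListSelector's validation; class_from_tunable_sketch is hypothetical:

```python
def class_from_tunable_sketch(tunable, base=object, default=()):
    """Hypothetical stand-in: build a class whose 'only' values must come from tunable."""
    allowed = frozenset(tunable)

    def __init__(self, only=default):
        # Mimic ListSelector validation: reject values outside tunable
        bad = set(only) - allowed
        if bad:
            raise ValueError('not tunable: %s' % sorted(bad))
        self.only = list(only)

    return type('Derived', (base,), {'__init__': __init__})


OptimParams = class_from_tunable_sketch({'model.lr', 'model.num_layers'})
p = OptimParams(only=['model.lr'])
```

The real function instead attaches a param.ListSelector named only to the derived Parameterized class, so validation happens through param's machinery.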

Examples

Group the hyperparameters you wish to optimize in the same param dict as the rest of your parameters

>>> class ModelParams(param.Parameterized):
...     lr = param.Number(1e-4, bounds=(1e-8, None))
...     num_layers = param.Integer(3, bounds=(1, None))
...     @classmethod
...     def get_tunable(cls):
...         return {'num_layers', 'lr'}
...     @classmethod
...     def suggest_params(cls, trial, base=None, only=None, prefix=''):
...         pass  # do this
>>>
>>> param_dict = {'model': ModelParams()}
>>> tunable = get_param_dict_tunable(param_dict)
>>> OptimParams = parameterized_class_from_tunable(tunable)
>>> param_dict['hyperparameter_optimization'] = OptimParams()
pydrobert.param.optuna.suggest_param_dict(trial, global_dict, only=None, on_decimal='warn', warn=True)[source]

Use Optuna trial to sample values for TunableParameterized in dict

This function creates a deep copy of the dictionary global_dict. Then, for every TunableParameterized it finds in the copy, it calls that instance’s suggest_params() to optimize an appropriate subset of parameters.

Parameters:
  • trial (optuna.trial.Trial) – The trial from an Optuna experiment. This is passed along to each TunableParameterized in global_dict

  • global_dict (dict) – A (possibly nested) dictionary containing some TunableParameterized as values

  • only (Optional[Set[str]]) – A set containing parameter names to optimize. Names are formatted "<key_0>.<key_1>.<...>.<parameter_name>", where parameter_name is a parameter from global_dict[<key_0>][<key_1>][...].get_tunable(). If None, the entire set returned by get_param_dict_tunable() will be optimized.

  • on_decimal (Literal['ignore', 'warn', 'raise']) – ‘.’ can produce ambiguous parameters in only. When one is found as a key in global_dict or as a tunable parameter: “raise” means a ValueError will be raised; “warn” means a warning will be issued via warnings; and “ignore” just ignores it

  • warn (bool) – If warn is True, a warning will be issued via warnings for any element of only that does not correspond to a tunable parameter in global_dict

Returns:

param_dict (dict)
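The deep-copy-then-delegate behaviour can be sketched end to end with stdlib stand-ins; every name below (DummyTrial, ModelParams, suggest_param_dict_sketch) is hypothetical and only illustrates the pattern this function implements:

```python
import copy


class DummyTrial:
    """Hypothetical stand-in for optuna.trial.Trial; always samples the upper bound."""

    def suggest_int(self, name, low, high):
        return high


class ModelParams:
    def __init__(self):
        self.num_layers = 3

    @classmethod
    def get_tunable(cls):
        return {'num_layers'}

    @classmethod
    def suggest_params(cls, trial, base=None, only=None, prefix=''):
        params = cls() if base is None else base
        if only is None or 'num_layers' in only:
            params.num_layers = trial.suggest_int(prefix + 'num_layers', 1, 10)
        return params


def suggest_param_dict_sketch(trial, global_dict, only=None):
    # Deep-copy so the caller's dictionary is never mutated
    param_dict = copy.deepcopy(global_dict)
    for key, value in param_dict.items():
        if hasattr(value, 'suggest_params'):
            # Strip the '<key>.' prefix so each instance sees bare parameter names
            sub_only = None if only is None else {
                name[len(key) + 1:] for name in only if name.startswith(key + '.')
            }
            value.suggest_params(trial, base=value, only=sub_only, prefix=key + '.')
    return param_dict


global_dict = {'model': ModelParams()}
sampled = suggest_param_dict_sketch(
    DummyTrial(), global_dict, only={'model.num_layers'})
```

Note the original global_dict is untouched: each trial works on its own copy, which is what makes parallel Optuna studies safe.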