pydrobert.param.optuna
Utilities for optimizing param parameters via Optuna
See also
- Hyperparameter Optimization with Optuna
A tutorial on how to use this module
- class pydrobert.param.optuna.TunableParameterized(**params)[source]
Bases: AbstractParameterized
An interface for Optuna to tune Parameterized instances
The TunableParameterized interface requires two class methods: get_tunable() and suggest_params(). Any object with both is a TunableParameterized. Just like in collections.abc, the class need not directly subclass TunableParameterized for isinstance() and issubclass() to return True. Subclassing TunableParameterized directly will ensure the subclass also inherits from param.parameterized.Parameterized
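For example, a minimal sketch of this collections.abc-style duck typing (the class DuckParams and its trivial method bodies are made up for illustration):
>>> import param
>>> from pydrobert.param.optuna import TunableParameterized
>>> class DuckParams(param.Parameterized):
>>>     @classmethod
>>>     def get_tunable(cls):
>>>         return set()
>>>     @classmethod
>>>     def suggest_params(cls, trial, base=None, only=None, prefix=''):
>>>         return cls() if base is None else base
>>> issubclass(DuckParams, TunableParameterized)
True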
- abstract classmethod get_tunable()[source]
Get a set of names of tunable parameters
The values are intended to be names of parameters. Values should not contain '.'
- abstract classmethod suggest_params(trial, base=None, only=None, prefix='')[source]
Populate an instance of this class with parameters based on trial
- Parameters
trial (optuna.trial.Trial) – The current optuna trial. Parameter values will be sampled from this
base (Optional[TunableParameterized]) – If set, parameter values will be loaded into this instance. If None, a new instance will be created matching this class type
only (Optional[Collection[str]]) – Only sample parameters with names in this set. If None, all the parameters from get_tunable() will be sampled
prefix (str) – A value to be prepended to the names from only when sampling those parameters from trial
- Returns
TunableParameterized – Either base if not None, or a new instance of this class with parameters matching sampled values
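A hedged sketch of a typical implementation, assuming two made-up parameters (lr and num_layers) and Optuna's trial.suggest_float() and trial.suggest_int() samplers:
>>> class ModelParams(param.Parameterized):
>>>     lr = param.Number(1e-4, bounds=(1e-8, None))
>>>     num_layers = param.Integer(3, bounds=(1, None))
>>>     @classmethod
>>>     def get_tunable(cls):
>>>         return {'lr', 'num_layers'}
>>>     @classmethod
>>>     def suggest_params(cls, trial, base=None, only=None, prefix=''):
>>>         params = cls() if base is None else base
>>>         if only is None:
>>>             only = cls.get_tunable()
>>>         if 'lr' in only:
>>>             params.lr = trial.suggest_float(prefix + 'lr', 1e-8, 1e-1, log=True)
>>>         if 'num_layers' in only:
>>>             params.num_layers = trial.suggest_int(prefix + 'num_layers', 1, 10)
>>>         return params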
- pydrobert.param.optuna.get_param_dict_tunable(param_dict, on_decimal='warn')[source]
Return a set of all the tunable parameters in a parameter dictionary
This function crawls through a (possibly nested) dictionary of objects, looks for any that implement the TunableParameterized interface, collects the results of calls to get_tunable(), and returns the set tunable. Elements of tunable are strings with the format "<key_0>.<key_1>.<...>.<parameter_name>", where parameter_name is a parameter from param_dict[<key_0>][<key_1>][...].get_tunable()
- Parameters
param_dict (dict) – The (possibly nested) dictionary of objects to crawl
on_decimal (Literal['ignore', 'warn', 'raise']) – '.' can produce ambiguous parameters in tunable. When one is found as a key in param_dict or as a tunable parameter: "raise" means a ValueError will be raised; "warn" means a warning will be issued via warnings; and "ignore" just ignores it
- Returns
tunable (collections.OrderedDict)
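Continuing the hedged ModelParams sketch above, a usage example of crawling a nested parameter dictionary (the dictionary keys are arbitrary):
>>> from pydrobert.param.optuna import get_param_dict_tunable
>>> param_dict = {'model': ModelParams(), 'data': {'loader': ModelParams()}}
>>> tunable = get_param_dict_tunable(param_dict)
>>> sorted(tunable)
['data.loader.lr', 'data.loader.num_layers', 'model.lr', 'model.num_layers']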
- pydrobert.param.optuna.parameterized_class_from_tunable(tunable, base=<class 'param.parameterized.Parameterized'>, default=[])[source]
Construct a Parameterized class to store parameters to optimize
This function creates a subclass of param.parameterized.Parameterized that has only one parameter: only. only is a param.ListSelector that allows values from tunable
- Parameters
tunable (Collection) – The allowed values of the only parameter
base (Type[TypeVar(P, bound=Parameterized)]) – The parent class of the returned class
default (list) – The default value for the only parameter
- Returns
Derived (base) – The derived class
Examples
Group what hyperparameters you wish to optimize in the same param dict as the rest of your parameters
>>> class ModelParams(param.Parameterized):
>>>     lr = param.Number(1e-4, bounds=(1e-8, None))
>>>     num_layers = param.Integer(3, bounds=(1, None))
>>>     @classmethod
>>>     def get_tunable(cls):
>>>         return {'num_layers', 'lr'}
>>>     @classmethod
>>>     def suggest_params(cls, trial, base=None, only=None, prefix=None):
>>>         pass  # do this
>>>
>>> param_dict = {'model': ModelParams()}
>>> tunable = get_param_dict_tunable(param_dict)
>>> OptimParams = parameterized_class_from_tunable(tunable)
>>> param_dict['hyperparameter_optimization'] = OptimParams()
- pydrobert.param.optuna.suggest_param_dict(trial, global_dict, only=None, on_decimal='warn', warn=True)[source]
Use Optuna trial to sample values for TunableParameterized in dict
This function creates a deep copy of the dictionary global_dict. Then, for every TunableParameterized it finds in the copy, it calls that instance's suggest_params() to optimize an appropriate subset of parameters.
- Parameters
trial (optuna.trial.Trial) – The trial from an Optuna experiment. This is passed along to each TunableParameterized in global_dict
global_dict (dict) – A (possibly nested) dictionary containing some TunableParameterized as values
only (Optional[Set[str]]) – A set containing parameter names to optimize. Names are formatted "<key_0>.<key_1>.<...>.<parameter_name>", where parameter_name is a parameter from global_dict[<key_0>][<key_1>][...].get_tunable(). If None, the entire set returned by get_param_dict_tunable() is optimized
on_decimal (Literal['ignore', 'warn', 'raise']) – '.' can produce ambiguous parameters in only. When one is found as a key in global_dict or as a tunable parameter: "raise" means a ValueError will be raised; "warn" means a warning will be issued via warnings; and "ignore" just ignores it
warn (bool) – If warn is True and any elements of only do not match the format above, a warning will be issued via warnings
- Returns
param_dict (dict) – A deep copy of global_dict whose TunableParameterized instances have had an appropriate subset of parameters set via suggest_params()
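To tie these together, a hedged end-to-end sketch; the objective function, the train_and_evaluate routine, and the study settings are illustrative, not part of this module:
>>> import optuna
>>> from pydrobert.param.optuna import suggest_param_dict
>>> global_dict = {'model': ModelParams()}  # hypothetical class sketched earlier
>>> def objective(trial):
>>>     param_dict = suggest_param_dict(trial, global_dict, only={'model.lr', 'model.num_layers'})
>>>     return train_and_evaluate(param_dict)  # hypothetical routine returning a validation loss
>>> study = optuna.create_study(direction='minimize')
>>> study.optimize(objective, n_trials=50)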