Optimizers (optimizers package)
To perform demographic inference, optimization should be run. GADMA provides different optimizers: local search and global search algorithms.
Base Classes
Module gadma.optimizers.optimizer contains several base classes of optimizers.
class gadma.optimizers.optimizer.ConstrainedOptimizer(log_transform=False, maximize=False)
    Bases: gadma.optimizers.optimizer.ContinuousOptimizer

    Base class for constrained optimization, i.e. when values of variables have bounds.

    check_variables(variables)
        Returns True if all variables have a constrained domain.
class gadma.optimizers.optimizer.ContinuousOptimizer(log_transform=False, maximize=False)
    Bases: gadma.optimizers.optimizer.Optimizer

    Base class for optimization of continuous variables.

    check_variables(variables)
        Returns True if all variables are instances of the gadma.utils.ContinuousVariable class.
class gadma.optimizers.optimizer.Optimizer(log_transform=False, maximize=False)
    Bases: object

    Base class for optimizers. The most important methods are gadma.optimizers.Optimizer.evaluate() and gadma.optimizers.Optimizer.optimize(). To create a new optimizer class one should implement at least gadma.optimizers.Optimizer.valid_restore_file() and gadma.optimizers.Optimizer.optimize().

    Parameters:
        log_transform (bool) – If True then all parameters are optimized in log scale.
        maximize (bool) – If True then the target function is maximized.
    check_variables(variables)
        Checks that all variables are instances of the gadma.utils.Variable class.
    evaluate(f, x, args=(), linear_constrain=None)
        Evaluates function f on values x.

        Parameters:
            f – Target function.
            x – Values of parameters of f.
            args – Other arguments of f: f(x, args).
            linear_constrain – Linear constraint on x.
    load(save_file)
        Loads information that was saved by the save() method.

        Parameters:
            save_file – File to restore information from.

        Note:
            In the base class this method just loads from save_file with pickle.
    property log_transform
    optimize(variables, args=(), options={}, linear_constrain=None, maxiter=None, maxeval=None, verbose=0, callback=None, report_file=None, eval_file=None, save_file=None, restore_file=None, restore_points_only=False, restore_x_transform=None)
        Runs optimization of the target function.

        Parameters:
            f (func) – Target function to optimize.
            variables (gadma.utils.VariablePool) – List of variables whose values are optimized.
            args (tuple) – Additional arguments of the target function.
            options (dict) – Additional options kwargs for optimization.
            linear_constrain (gadma.optimizers.LinearConstrain) – Linear constraint on the optimized variables. It is an optional argument and may be missing in unconstrained optimizers.
            maxiter (int) – Maximum number of iterations to run. If None then run until convergence.
            maxeval (int) – Maximum number of evaluations to run. If None then run until convergence.
            verbose (int) – Verbosity of the output. If 0 then no reports.
            callback (function) – Callback to run after each iteration of optimization. It is called as callback(x, y).
            report_file (str) – File to save the report to. See the verbose option.
            eval_file (str) – File to save all evaluations of the function f.
            save_file (str) – File to save information during optimization so the run can be reconstructed.
            restore_file (str) – File to restore a previous run from.
            restore_points_only (bool) – Restore the point(s) from a previous run and start optimization from them anew. If False then the previous run is resumed.
            restore_x_transform (function) – Restore points but transform them before usage in this run.
    prepare_callback(callback)
        Wraps a callback function for usage inside the optimizer. It transforms x according to log_transform and multiplies y by -1 if maximize is True.

        Parameters:
            callback – Callback function with the signature callback(x, y), where x is the value of the parameters and y is the value of the target function on x.
    prepare_f_for_opt(f, args=(), cache=True)
        Prepares f for usage inside the optimizer. The function is transformed according to log_transform and maximize, its arguments are fixed, and it can optionally be cached.

        Parameters:
            f – Target function to work with.
            args (tuple) – Arguments of the function: f(x, args).
            cache (bool) – If True then the function is cached.
    save(info, save_file)
        Saves information into a file. It is intended to save info during optimization so that the run can be restored later.

        Parameters:
            info – Information to dump.
            save_file – File to save the information to.

        Note:
            If save_file is None then nothing is done. In the base class this method just dumps info to save_file with pickle.
    property sign
        Returns -1 for maximization and 1 for minimization of the target function.
    valid_restore_file(save_file)
        Checks that save_file contains valid information and the run can be restored from it.

        Parameters:
            save_file – File to check.
    wrap_for_report(f, variables, verbose, report_file)
        Wraps function f for automatic report output. When the function is called, a report is saved to report_file every verbose-th call.

        Parameters:
            f – Function to wrap.
            variables – Variables of function f.
            verbose – Verbosity level.
            report_file – Filename to save the report to.
    write_report(n_iter, variables, x, y, report_file)
        Writes the optimizer report to a file or to stdout.

        Parameters:
            n_iter – Iteration number of the optimization.
            variables – List of variables whose values are optimized.
            x – Values of the variables.
            y – Value of the target function on x.
            report_file – Filename to write the report to. If None then the report is printed to stdout.
class gadma.optimizers.optimizer.UnconstrainedOptimizer(log_transform=False, maximize=False)
    Bases: gadma.optimizers.optimizer.ContinuousOptimizer

    Base class for unconstrained optimization, i.e. when values of variables have no bounds.

    check_variables(variables)
        Returns True if all variables have the domain [-inf, inf].
Global optimizers
Module gadma.optimizers.global_optimizer contains the base class for global optimizers.
An additional global optimizer can be implemented by creating a new subclass of gadma.optimizers.GlobalOptimizer and registering it with the function gadma.optimizers.register_global_optimizer(), as sketched below.
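A minimal sketch of how registration might look, assuming register_global_optimizer() accepts the new subclass itself; the hypothetical MyRandomSearch class fills in the two methods named above, and exact signatures may differ from this sketch:

    from gadma.optimizers import GlobalOptimizer, register_global_optimizer

    class MyRandomSearch(GlobalOptimizer):
        """Hypothetical optimizer: evaluates random points and keeps the best one."""
        id = "My_random_search"  # hypothetical ID used for registration

        def valid_restore_file(self, save_file):
            return False  # no save/restore support in this sketch

        def optimize(self, f, variables, num_init, X_init=None, Y_init=None,
                     args=(), options={}, linear_constrain=None, maxiter=None):
            # Evaluate the initial design only; a real optimizer would iterate
            # further and return a gadma.optimizers.OptimizerResult.
            f_fixed = self.fix_f_and_args(f, args)  # assumed helper, see initial_design() docs
            X, Y = self.initial_design(f_fixed, variables, num_init, X_init, Y_init)
            return min(zip(X, Y), key=lambda xy: xy[1])

    register_global_optimizer("My_random_search", MyRandomSearch)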
Registered global optimizers
The following global optimizers are registered:

ID                      | Description                    | Instance of
------------------------|--------------------------------|-------------------
"Genetic_algorithm"     | Genetic algorithm optimization | GeneticAlgorithm
"Bayesian_optimization" | Bayesian optimization          | BayesianOptimizer
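A short usage sketch, assuming GADMA is installed: list the registered global optimizers and retrieve one by its ID.

    from gadma.optimizers import all_global_optimizers, get_global_optimizer

    print([opt.id for opt in all_global_optimizers()])  # e.g. ['Genetic_algorithm', 'Bayesian_optimization']
    ga = get_global_optimizer("Genetic_algorithm")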
class gadma.optimizers.global_optimizer.GlobalOptimizer(log_transform=False, maximize=False)
    Bases: gadma.optimizers.optimizer.Optimizer

    Base class for global optimization. See gadma.optimizers.Optimizer for more information.

    initial_design(f, variables, num_init, X_init=None, Y_init=None, random_type='resample', custom_rand_gen=None)
        Performs the initial design for optimization.

        Parameters:
            f – Function to use for evaluations. Note that it should take no extra arguments; use self.fix_f_and_args() to obtain such a function from one with arguments.
            variables – Variables of the function. They are used to generate random values.
            num_init – Number of initial solutions.
            X_init – List of some initial solutions.
            Y_init – List of function values on the initial solutions.

        Returns:
            A pair of lists X and Y: the initial points and the values of the fitness function on them.
    optimize(f, variables, num_init, X_init=None, Y_init=None, args=(), options={}, linear_constrain=None, maxiter=None)
        Runs optimization.

        Parameters:
            f (func) – Target function to optimize.
            variables (gadma.utils.VariablePool) – Variables of f whose values should be optimized.
            num_init (int) – Number of points in the initial design.
            X_init (list) – List of initial points.
            Y_init (list) – Values of the target function on the initial points.
            args (tuple) – Additional arguments of the target function.
            options (dict) – Additional options kwargs for optimization.
            linear_constrain (gadma.optimizers.LinearConstrain) – Linear constraint on the optimized variables.
            maxiter (int) – Maximum number of iterations to run.
    randomize(variables, random_type='resample', custom_rand_gen=None)
        Generates a random solution. The type of generation can be set to one of three operators:

        * 'uniform' – uniform over the domain.
        * 'resample' – call the resample() method of each variable.
        * 'custom' – sample values of the parameters from custom_rand_gen.
gadma.optimizers.global_optimizer.all_global_optimizers()
    Returns an iterator over all registered global optimizers.

gadma.optimizers.global_optimizer.get_global_optimizer(id)
    Returns the global optimizer with the specified id.

gadma.optimizers.global_optimizer.register_global_optimizer(id, optimizer)
    Registers the specified global optimizer.
Genetic algorithm

class gadma.optimizers.genetic_algorithm.GeneticAlgorithm(gen_size=10, n_elitism=2, p_mutation=0.3, p_crossover=0.3, p_random=0.2, mut_strength=0.2, const_mut_strength=1.1, mut_rate=0.2, const_mut_rate=1.2, mut_attempts=2, eps=0.01, n_stuck_gen=100, selection_type='roulette_wheel', selection_random=False, mutation_type='gaussian', one_fifth_rule=True, crossover_type='uniform', crossover_k=None, random_type='resample', custom_rand_gen=None, log_transform=False, maximize=False)
    Bases: gadma.optimizers.global_optimizer.GlobalOptimizer, gadma.optimizers.optimizer.ConstrainedOptimizer

    Class for the genetic algorithm. A usage sketch follows the parameter list below.

    Parameters:
        gen_size (int) – Size of a generation of the genetic algorithm, i.e. the number of individuals/solutions at each step of the GA.
        n_elitism (int) – Number of best models from the previous GA generation that are carried over to the new iteration.
        p_mutation (float) – Probability of mutation in one generation of the GA.
        p_crossover (float) – Probability of crossover in one generation of the GA.
        p_random (float) – Probability of a randomly generated individual in one generation of the GA.
        mut_rate (float) – Initial mean mutation rate.
        mut_strength (float) – Initial mutation "strength": the mean fraction of model parameters that will be mutated.
        const_mut_rate (float) – Constant used to change the mutation rate according to the one-fifth rule. See the GADMA paper for more information.
        eps (float) – Constant for comparing models' log-likelihoods. A model is considered better if its log-likelihood is greater than that of another model by eps.
        n_stuck_gen (int) – Stopping criterion of the GA: it stops when it cannot improve the model for n_stuck_gen generations.
        selection_type (str) – Type of selection operator in the GA. Could be 'roulette_wheel' or 'rank'. See help(GeneticAlgorithm.selection) for more information.
        selection_random – If True then the numbers of mutants and crossover offspring in a new generation are binomial random variables.
        mutation_type (str) – Type of mutation operator in the GA. Could be 'uniform', 'resample' or 'gaussian'. See help(GeneticAlgorithm.mutation) for more information.
        one_fifth_rule (bool) – If True then the one-fifth rule is used in mutation.
        crossover_type (str) – Type of crossover operator in the GA. Could be 'k-point' or 'uniform'. See help(GeneticAlgorithm.crossover) for more information.
        crossover_k (int) – k for the 'k-point' crossover type.
        random_type (str) – Type of random generation of new offspring. Could be 'uniform', 'resample' or 'custom'. See help(GlobalOptimizer.randomize) for more information.
        custom_rand_gen (func) – Random generator for the 'custom' random_type. Provide a generator from variables: custom_rand_gen(variables) = values.
        log_transform (bool) – If True then a logarithm is used internally for the parameters.
        maximize (bool) – If True then the optimization maximizes the function.
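    A hedged usage sketch on a toy target function. It assumes gadma.utils provides ContinuousVariable(name, domain=[lower, upper]); the real constructor and the fields of the returned result object may differ.

        import numpy as np
        from gadma.optimizers.genetic_algorithm import GeneticAlgorithm
        from gadma.utils import ContinuousVariable

        def sphere(x):
            # Toy function to minimize: sum of squares.
            return float(np.sum(np.asarray(x) ** 2))

        variables = [ContinuousVariable("x0", domain=[-5, 5]),
                     ContinuousVariable("x1", domain=[-5, 5])]

        ga = GeneticAlgorithm(gen_size=10, n_stuck_gen=20)
        result = ga.optimize(sphere, variables, num_init=20, maxiter=50)
        print(result.x, result.y)  # best point found and its value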
    check_x(variables, x, raises=False)
    crossover(parent1, parent2, variables, crossover_type='uniform', k=2, one_child=True)
        Crossover operator of the genetic algorithm. It can be of two types (a toy sketch follows the parameter list below):

        * 'k-point' – k points are chosen along the vector and the parts between those points are taken alternately from parent1 and parent2 (swapping). By default k=2.
        * 'uniform' – each parameter is taken from either parent with equal probability.

        Parameters:
            parent1 – Array of the first parent.
            parent2 – Array of the second parent.
            crossover_type (str) – Type of crossover operator. Could be 'k-point' or 'uniform'.
            k (int) – Value of k for the 'k-point' crossover.
            one_child (bool) – If True then only one child is generated and returned.
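        A toy, self-contained sketch of the two crossover types described above (pure NumPy; it does not call GADMA's own implementation):

            import numpy as np

            def uniform_crossover(parent1, parent2, rng):
                # Each parameter comes from either parent with probability 1/2.
                take_first = rng.random(len(parent1)) < 0.5
                return np.where(take_first, parent1, parent2)

            def k_point_crossover(parent1, parent2, k, rng):
                # Choose k cut points and swap the source parent after each cut.
                n = len(parent1)
                cuts = np.sort(rng.choice(np.arange(1, n), size=k, replace=False))
                child = np.array(parent1, dtype=float)
                use_second, prev = False, 0
                for cut in list(cuts) + [n]:
                    if use_second:
                        child[prev:cut] = parent2[prev:cut]
                    use_second, prev = not use_second, cut
                return child

            rng = np.random.default_rng(0)
            p1, p2 = np.zeros(6), np.ones(6)
            print(uniform_crossover(p1, p2, rng))
            print(k_point_crossover(p1, p2, k=2, rng=rng))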
    id = 'Genetic_algorithm'
    is_stopped(n_gen, n_eval, impr_gen=None, maxiter=None, maxeval=None, ret_status=False)
        Returns whether the genetic algorithm must stop.

        Parameters:
            n_gen – Current number of generations.
            n_eval – Current number of function evaluations.
            impr_gen – Number of the last generation that improved the value of the fitness function.
            maxiter – Maximum number of generations.
            maxeval – Maximum number of evaluations.
            ret_status – If True then also return status and message.
    load(save_file)
        Loads saved values of the genetic algorithm from a file.
    mutation(x, variables, mutation_type='gaussian', one_fifth_rule=True, attemts=1)
        Mutation operator of the genetic algorithm applied to values x of variables. The number of parameters to mutate is sampled from a binomial distribution with mean equal to the mutation strength. The type of change of the chosen parameters can be set to one of three operators (a toy sketch of the 'gaussian' operator follows the parameter list below):

        * 'uniform' – new values are sampled uniformly between the bounds.
        * 'resample' – new values are sampled from the random distribution of the variables.
        * 'gaussian' – a Gaussian-distributed random change is applied; the mean of the Gaussian distribution is taken from the mutation rate.

        Parameters:
            x – Values to mutate.
            variables – Variables.
            mutation_type (str) – Type of mutation operator. Could be 'gaussian', 'uniform' or 'resample'.
            fval – Maximum number of function evaluations.
            one_fifth_rule – If True then the one-fifth rule is used. For the 'gaussian' option only.
            weights – Weights for the parameters in x; the greater the weight, the greater the probability that the parameter is changed.
            attemts (int) – Number of mutation attempts.

        Returns:
            A mutated offspring. If attemts > 1 then a list of mutated offspring. All offspring carry information about the number of changes of each parameter in their weights attribute.
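        A toy sketch of a 'gaussian'-style mutation (pure NumPy, not GADMA's own code): the number of mutated parameters follows a binomial distribution controlled by the mutation strength, and each chosen parameter is perturbed by a Gaussian factor governed by the mutation rate.

            import numpy as np

            def gaussian_mutation(x, mut_strength=0.2, mut_rate=0.2, rng=None):
                rng = rng or np.random.default_rng()
                x = np.asarray(x, dtype=float)
                n = len(x)
                n_mut = max(1, rng.binomial(n, mut_strength))  # how many parameters to mutate
                idx = rng.choice(n, size=n_mut, replace=False)
                child = x.copy()
                # Multiplicative Gaussian perturbation; the scale is set by mut_rate.
                child[idx] *= 1 + rng.normal(loc=0.0, scale=mut_rate, size=n_mut)
                return child

            print(gaussian_mutation([1.0, 2.0, 3.0, 4.0], rng=np.random.default_rng(1)))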
    mutation_by_ind(x, variables, index, mutation_type='gaussian', one_fifth_rule=True)
        Mutation of x at index index. For more information see mutation().
    optimize(f, variables, args=(), num_init=50, X_init=None, Y_init=None, linear_constrain=None, maxiter=None, maxeval=None, verbose=0, callback=None, report_file=None, eval_file=None, save_file=None, restore_file=None, restore_points_only=False, restore_x_transform=None)
        Returns the best values of the variables that minimize/maximize the function f.

        Parameters:
            f – Function to minimize/maximize. It is used as f(x, *args), where x is a list of values.
            variables – List of variables (instances of the gadma.Variable class) of the function.
            X_init – List of initial values.
            Y_init – Values of the function f on the initial values from X_init.
            args – Arguments of the function f.
            maxiter – Maximum number of generations of the genetic algorithm.
            maxeval – Maximum number of function evaluations.
            callback – Callback to call after each generation. It is called as callback(x, y), where x and y are the best solution of the generation and its fitness.
    randomize(variables, random_type='resample', custom_rand_gen=None)
        Generates a random solution. The type of generation can be set to one of three operators:

        * 'uniform' – uniform over the domain.
        * 'resample' – call the resample() method of each variable.
        * 'custom' – sample values of the parameters from custom_rand_gen.
    save(n_gen, n_eval, n_impr_gen, X_gen, Y_gen, X_total, Y_total, save_file)
        Saves values of the genetic algorithm to a file.
    selection(f, variables, X_gen, Y_gen=None, selection_type='roulette_wheel', selection_random=False)
        Performs selection in the genetic algorithm. Selection can be of different types (a toy sketch of roulette-wheel selection follows the parameter list below):

        * Roulette wheel – the better an individual's fitness, the higher its chance to be selected for mutation and crossover.
        * Rank – almost the same as roulette wheel but based on rank instead of the fitness function: weight=1 for the best individual, weight=2 for the second best and so on.

        Parameters:
            X_gen – Previous generation of individuals.
            Y_gen – Fitnesses of the previous generation. If None then they are evaluated.
            selection_type – Type of selection. Could be 'roulette_wheel' or 'rank'.
            selection_random – If True then the numbers of mutants and crossover offspring in the new generation are binomial random variables.

        Returns:
            The new generation and its fitnesses.
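        A toy, self-contained sketch of roulette-wheel selection for a minimization problem (pure NumPy, not GADMA's own implementation): lower fitness values get proportionally higher selection probabilities.

            import numpy as np

            def roulette_wheel_select(X_gen, Y_gen, n_select, rng=None):
                rng = rng or np.random.default_rng()
                Y = np.asarray(Y_gen, dtype=float)
                weights = Y.max() - Y + 1e-12       # lower value -> larger weight
                probs = weights / weights.sum()
                idx = rng.choice(len(X_gen), size=n_select, replace=True, p=probs)
                return [X_gen[i] for i in idx]

            X_gen = [[0.1, 0.2], [1.0, 1.0], [2.0, 0.5]]
            Y_gen = [0.05, 2.0, 4.25]  # fitness (to be minimized) of each individual
            print(roulette_wheel_select(X_gen, Y_gen, n_select=2,
                                        rng=np.random.default_rng(0)))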
    valid_restore_file(save_file)
        Checks that save_file contains valid information and the run can be restored from it.

        Parameters:
            save_file – File to check.
    write_report(n_gen, variables, X_gen, Y_gen, x_best, y_best, mean_time, report_file)
        Writes a report about one generation to the report file.

        Note:
            All values are reported as is, i.e. X_gen and x_best should already be translated from log scale if the optimization used it; Y_gen and y_best must already be multiplied by -1 if maximization is performed instead of minimization.
Bayesian optimization

class gadma.optimizers.bayesian_optimization.BayesianOptimizer(kernel='Matern52', ARD=True, acquisition_type='MPI', log_transform=False, maximize=False)
    Bases: gadma.optimizers.global_optimizer.GlobalOptimizer, gadma.optimizers.optimizer.ConstrainedOptimizer

    Class for Bayesian optimization.

    get_domain(variables)
    get_kernel(ndim)

    gpyopt_inv_transform(x)

    gpyopt_transform(x)

    id = 'Bayesian_optimization'
    optimize(f, variables, args=(), num_init=10, X_init=None, Y_init=None, linear_constrain=None, maxiter=100, maxeval=100, verbose=0, callback=None, report_file=None, eval_file=None, save_file=None)
        Returns the best values of the variables that minimize/maximize the function f.

        Parameters:
            f – Function to minimize/maximize. It is used as f(x, *args), where x is a list of values.
            variables – List of variables (instances of the gadma.Variable class) of the function.
            X_init – List of initial values.
            Y_init – Values of the function f on the initial values from X_init.
            args – Arguments of the function f.
            maxiter – Maximum number of iterations.
            maxeval – Maximum number of function evaluations.
            callback – Callback to call after each iteration. It is called as callback(x, y), where x and y are the best solution of the iteration and its fitness.
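        A hedged sketch of running the registered Bayesian optimizer on the same toy problem as in the genetic algorithm sketch above (it requires GPyOpt to be installed; the variable class name is an assumption):

            from gadma.optimizers import get_global_optimizer
            from gadma.utils import ContinuousVariable

            variables = [ContinuousVariable("x0", domain=[-5, 5]),
                         ContinuousVariable("x1", domain=[-5, 5])]
            bo = get_global_optimizer("Bayesian_optimization")
            result = bo.optimize(lambda x: sum(v * v for v in x), variables,
                                 num_init=10, maxiter=30)
            print(result.x, result.y)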
    write_report(bo_obj, report_file, x, y)
        Writes a report about each iteration to a file or to stdout.

        Parameters:
            bo_obj (GPyOpt.methods.BayesianOptimization) – Object of Bayesian optimization.
            report_file – File to write the report to. If None then to stdout.
            x – Current solution.
            y – Current value of the fitness function.
Local optimizers
Module gadma.optimizers.local_optimizer contains classes for local search optimizers.
An additional local optimizer can be implemented by creating a new subclass of gadma.optimizers.local_optimizer.LocalOptimizer and registering it with the function gadma.optimizers.local_optimizer.register_local_optimizer().
Registered local optimizers
The following local optimizers are registered:

ID                | Description                                                            | Instance of
------------------|------------------------------------------------------------------------|-------------------------
None or "None"    | No optimization is run                                                 | NoneOptimizer
"L-BFGS-B"        | L-BFGS-B from scipy                                                    | ScipyConstrOptimizer
"L-BFGS-B_log"    | L-BFGS-B from scipy with log transform of values                       | ScipyConstrOptimizer
"BFGS"            | Constrained BFGS from scipy                                            | ManuallyConstrOptimizer
"BFGS_log"        | Constrained BFGS from scipy with log transform of values               | ManuallyConstrOptimizer
"Powell"          | Constrained Powell's method from scipy                                 | ManuallyConstrOptimizer
"Powell_log"      | Constrained Powell's method from scipy with log transform of values    | ManuallyConstrOptimizer
"Nelder-Mead"     | Constrained Nelder-Mead method from scipy                              | ManuallyConstrOptimizer
"Nelder-Mead_log" | Constrained Nelder-Mead method from scipy with log transform of values | ManuallyConstrOptimizer
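A hedged sketch: retrieve a registered local optimizer by its ID and run it from an initial point x0 on a toy function (the variable class name and import path are assumptions):

    from gadma.optimizers import get_local_optimizer
    from gadma.utils import ContinuousVariable

    variables = [ContinuousVariable("x0", domain=[1e-3, 10]),
                 ContinuousVariable("x1", domain=[1e-3, 10])]
    opt = get_local_optimizer("L-BFGS-B_log")
    result = opt.optimize(lambda x: (x[0] - 2) ** 2 + (x[1] - 3) ** 2,
                          variables, x0=[1.0, 1.0], maxiter=100)
    print(result.x, result.y)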
class gadma.optimizers.local_optimizer.LocalOptimizer(log_transform=False, maximize=False)
    Bases: gadma.optimizers.optimizer.Optimizer

    Base class for local optimization. See gadma.optimizers.Optimizer for more information.

    optimize(f, variables, x0, args=(), options={}, maxiter=None, maxeval=None, verbose=0, callback=None, report_file=None, eval_file=None, save_file=None, restore_file=None, restore_points_only=False, restore_x_transform=None)
        Runs optimization with the local search algorithm.

        Parameters:
            f (func) – Target function to optimize.
            variables (gadma.utils.VariablePool) – Variables of f whose values should be optimized.
            x0 (list) – Initial point to start optimization from.
            args (tuple) – Additional arguments of the target function.
            options (dict) – Additional options kwargs for optimization.
            maxiter (int) – Maximum number of iterations to run.
            maxeval (int) – Maximum number of evaluations to run. If None then run until convergence.
            verbose (int) – Verbosity of the output. If 0 then no reports.
            callback (function) – Callback to run after each iteration of optimization. It is called as callback(x, y).
            report_file (str) – File to save the report to. See the verbose option.
            eval_file (str) – File to save all evaluations of the function f.
            save_file (str) – File to save information during optimization so the run can be reconstructed.
            restore_file (str) – File to restore a previous run from.
            restore_points_only (bool) – Restore the point(s) from a previous run and start optimization from them anew. If False then the previous run is resumed.
            restore_x_transform (function) – Restore points but transform them before usage in this run.
class gadma.optimizers.local_optimizer.ManuallyConstrOptimizer(optimizer, log_transform=False)
    Bases: gadma.optimizers.local_optimizer.LocalOptimizer, gadma.optimizers.optimizer.ConstrainedOptimizer

    Class for constrained optimization that uses an unconstrained optimizer. The value of the target function is considered to be inf outside the bounds. A construction sketch follows the parameter list below.

    Parameters:
        optimizer (gadma.optimizers.UnconstrainedOptimizer) – Unconstrained optimizer to use.
        log_transform – If True then log scale is used for optimization. Be careful: the inner optimizer may already have log_transform set.
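    A hedged construction sketch: wrap an unconstrained SciPy-based optimizer so that points outside the variable bounds get a target value of inf (class names are taken from this module's documentation; exact usage may differ).

        from gadma.optimizers.local_optimizer import (ManuallyConstrOptimizer,
                                                      ScipyUnconstrOptimizer)

        # Constrained version of scipy's unconstrained BFGS.
        constrained_bfgs = ManuallyConstrOptimizer(ScipyUnconstrOptimizer("BFGS"))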
    evaluate_inner(f, x, bounds, args=())

    property id

    property maximize
    optimize(f, variables, x0, args=(), options={}, linear_constrain=None, maxiter=None, maxeval=None, verbose=0, callback=None, eval_file=None, report_file=None, save_file=None, restore_file=None, restore_points_only=False, restore_x_transform=None)
        Runs optimization with the local search algorithm.

        Parameters:
            f (func) – Target function to optimize.
            variables (gadma.utils.VariablePool) – Variables of f whose values should be optimized.
            x0 (list) – Initial point to start optimization from.
            args (tuple) – Additional arguments of the target function.
            options (dict) – Additional options kwargs for optimization.
            maxiter (int) – Maximum number of iterations to run.
            maxeval (int) – Maximum number of evaluations to run. If None then run until convergence.
            verbose (int) – Verbosity of the output. If 0 then no reports.
            callback (function) – Callback to run after each iteration of optimization. It is called as callback(x, y).
            report_file (str) – File to save the report to. See the verbose option.
            eval_file (str) – File to save all evaluations of the function f.
            save_file (str) – File to save information during optimization so the run can be reconstructed.
            restore_file (str) – File to restore a previous run from.
            restore_points_only (bool) – Restore the point(s) from a previous run and start optimization from them anew. If False then the previous run is resumed.
            restore_x_transform (function) – Restore points but transform them before usage in this run.
    prepare_callback(callback)
        Wraps a callback function for usage inside the optimizer. It transforms x according to log_transform and multiplies y by -1 if maximize is True.

        Parameters:
            callback – Callback function with the signature callback(x, y), where x is the value of the parameters and y is the value of the target function on x.
    valid_restore_file(save_file)
        Checks that save_file contains valid information and the run can be restored from it.

        Parameters:
            save_file – File to check.
class gadma.optimizers.local_optimizer.NoneOptimizer(log_transform=False, maximize=False)
    Bases: gadma.optimizers.local_optimizer.LocalOptimizer

    Class that inherits gadma.optimizers.LocalOptimizer but does not run any optimization.
    optimize(f, variables, x0, args=(), options={}, linear_constrain=None, maxiter=None, maxeval=None, verbose=0, callback=None, eval_file=None, report_file=None, save_file=None, restore_file=None, restore_points_only=False, restore_x_transform=None)
        Runs optimization with the local search algorithm.

        Parameters:
            f (func) – Target function to optimize.
            variables (gadma.utils.VariablePool) – Variables of f whose values should be optimized.
            x0 (list) – Initial point to start optimization from.
            args (tuple) – Additional arguments of the target function.
            options (dict) – Additional options kwargs for optimization.
            maxiter (int) – Maximum number of iterations to run.
            maxeval (int) – Maximum number of evaluations to run. If None then run until convergence.
            verbose (int) – Verbosity of the output. If 0 then no reports.
            callback (function) – Callback to run after each iteration of optimization. It is called as callback(x, y).
            report_file (str) – File to save the report to. See the verbose option.
            eval_file (str) – File to save all evaluations of the function f.
            save_file (str) – File to save information during optimization so the run can be reconstructed.
            restore_file (str) – File to restore a previous run from.
            restore_points_only (bool) – Restore the point(s) from a previous run and start optimization from them anew. If False then the previous run is resumed.
            restore_x_transform (function) – Restore points but transform them before usage in this run.
    save(x0, y, save_file)
        Saves information into a file. It is intended to save info during optimization so that the run can be restored later.

        Parameters:
            info – Information to dump.
            save_file – File to save the information to.

        Note:
            If save_file is None then nothing is done. In the base class this method just dumps info to save_file with pickle.
    valid_restore_file(save_file)
        Checks that save_file contains valid information and the run can be restored from it.

        Parameters:
            save_file – File to check.
class gadma.optimizers.local_optimizer.ScipyConstrOptimizer(method, log_transform=False, maximize=False)
    Bases: gadma.optimizers.local_optimizer.ScipyOptimizer, gadma.optimizers.optimizer.ConstrainedOptimizer

    get_addit_scipy_kwargs(variables)

    maxeval_kwarg = {'L-BFGS-B': 'maxfun'}

    opt_type = 'constrained'

    scipy_methods = ['L-BFGS-B', 'TNC', 'SLSQP']
class gadma.optimizers.local_optimizer.ScipyOptimizer(method, log_transform=False, maximize=False)
    Bases: gadma.optimizers.local_optimizer.LocalOptimizer

    Class of SciPy local search algorithms.

    Variables:
        ScipyOptimizer.scipy_methods – List of method names that are available.
        ScipyOptimizer.maxeval_kwarg – List of method names that support the maxeval argument.

    Parameters:
        method (str) – Name of a method from scipy.optimize.minimize().
        log_transform (bool) – If True then values are optimized in log scale.
        maximize (bool) – If True then the target function is maximized.
    get_addit_scipy_kwargs(variables)

    maxeval_kwarg = {}

    opt_type = ''
    optimize(f, variables, x0, args=(), options={}, linear_constrain=None, maxiter=None, maxeval=None, verbose=0, callback=None, eval_file=None, report_file=None, save_file=None, restore_file=None, restore_points_only=False, restore_x_transform=None)
        Runs SciPy optimization.

        Parameters:
            f (func) – Target function to optimize.
            variables (gadma.utils.VariablePool) – Variables of f whose values should be optimized.
            x0 (list) – Initial point to start optimization from.
            args (tuple) – Additional arguments of the target function.
            options (dict) – Additional options kwargs for the SciPy optimization.
            maxiter (int) – Maximum number of iterations to run.
            maxeval (int) – Maximum number of evaluations to run. If None then run until convergence.
            verbose (int) – Verbosity of the output. If 0 then no reports.
            callback (function) – Callback to run after each iteration of optimization. It is called as callback(x, y).
            report_file (str) – File to save the report to. See the verbose option.
            eval_file (str) – File to save all evaluations of the function f.
            save_file (str) – File to save information during optimization so the run can be reconstructed.
            restore_file (str) – File to restore a previous run from.
            restore_points_only (bool) – Restore the point(s) from a previous run and start optimization from them anew. If False then the previous run is resumed.
            restore_x_transform (function) – Restore points but transform them before usage in this run.
    prepare_callback(f, callback, save_file=None)
        Wraps a callback function for usage inside the optimizer. It transforms x according to log_transform and multiplies y by -1 if maximize is True.

        Parameters:
            callback – Callback function with the signature callback(x, y), where x is the value of the parameters and y is the value of the target function on x.
    save(x_best, y_best, n_iter, n_eval, is_finished, save_file)
        Saves the current progress of the optimization. Dumps x_best, y_best and is_finished to save_file using pickle.

        Parameters:
            x_best – Values of the best configuration.
            y_best – Value of the target function on x_best.
            is_finished – If True then the optimization has finished.
            save_file – Filename to save the data to.
    scipy_methods = []

    valid_restore_file(save_file)
        Checks that save_file contains valid information and the run can be restored from it.

        Parameters:
            save_file – File to check.
class gadma.optimizers.local_optimizer.ScipyUnconstrOptimizer(method, log_transform=False, maximize=False)
    Bases: gadma.optimizers.local_optimizer.ScipyOptimizer, gadma.optimizers.optimizer.UnconstrainedOptimizer

    Base class for SciPy unconstrained optimization.

    get_addit_scipy_kwargs(variables)

    maxeval_kwarg = {'COBYLA': 'maxiter', 'Nelder-Mead': 'maxfun', 'Powell': 'maxfev'}

    opt_type = 'unconstrained'

    scipy_methods = ['Nelder-Mead', 'Powell', 'CG', 'BFGS', 'Newton-CG', 'COBYLA', 'trust-constr', 'dogleg', 'trust-ncg', 'trust-exact', 'trust-krylov']
gadma.optimizers.local_optimizer.all_local_optimizers()
    Returns an iterator over all registered local optimizers.

gadma.optimizers.local_optimizer.get_local_optimizer(id)
    Returns the local optimizer with the specified id.

    Parameters:
        id – ID of the local optimizer.

gadma.optimizers.local_optimizer.register_local_optimizer(id, optimizer)
    Registers the specified local optimizer.

    Parameters:
        id – ID of the local optimizer to register.
        optimizer (gadma.optimizers.LocalOptimizer) – Optimizer to register.
Combinations of optimizers
Module gadma.optimizers.combinations contains classes of optimizers that are combinations of other optimizers.

class gadma.optimizers.combinations.GlobalOptimizerAndLocalOptimizer(global_optimizer, local_optimizer)
    Bases: gadma.optimizers.global_optimizer.GlobalOptimizer, gadma.optimizers.optimizer.ConstrainedOptimizer

    Class that runs a global optimizer followed by a local optimizer. It is classified as a constrained global optimizer. The optimize() function takes a function and variables, runs the global optimizer, then filters discrete variables out and optimizes the rest with the local optimizer. A construction sketch follows the parameter list below.

    Parameters:
        global_optimizer (gadma.optimizers.GlobalOptimizer) – Global optimizer to use inside.
        local_optimizer (gadma.optimizers.LocalOptimizer) – Local optimizer to use inside.
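    A hedged construction sketch combining a registered global optimizer with a registered local one (import paths are assumptions based on this page):

        from gadma.optimizers import get_global_optimizer, get_local_optimizer
        from gadma.optimizers.combinations import GlobalOptimizerAndLocalOptimizer

        combined = GlobalOptimizerAndLocalOptimizer(
            global_optimizer=get_global_optimizer("Genetic_algorithm"),
            local_optimizer=get_local_optimizer("L-BFGS-B_log"),
        )
        # combined.optimize(f, variables, global_maxiter=100, local_maxiter=50, verbose=1)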
    optimize(f, variables, args=(), global_num_init=50, X_init=None, Y_init=None, local_options={}, linear_constrain=None, global_maxiter=None, local_maxiter=None, global_maxeval=None, local_maxeval=None, verbose=0, callback=None, eval_file=None, report_file=None, save_file=None, restore_file=None, restore_points_only=False, global_x_transform=None, local_x_transform=None)

        Parameters:
            f (func) – Objective function.
            variables (list of gadma.utils.VariablePool) – List of objective function variables.
            args (tuple) – Arguments of f.
            global_num_init (int) – Number of initial points for the global optimizer.
            X_init (list) – List of initial vectors.
            Y_init (list) – List of values of the target function on the points of X_init.
            local_options (dict) – Options for the local optimizer.
            linear_constrain (gadma.optimizers.LinearConstrain) – Linear constraint on the variables.
            global_maxiter (int) – Maximum number of global optimizer iterations to run.
            global_maxeval (int) – Maximum number of function evaluations during global optimization.
            local_maxiter (int) – Maximum number of local optimizer iterations to run.
            local_maxeval (int) – Maximum number of function evaluations during local optimization.
            verbose (int) – Verbosity of reports. If 0 then no output.
            callback (func) – Callback to run after each iteration of both optimizers.
            eval_file (str) – File to save evaluations of the objective function.
            report_file (str) – File to save the report to every verbose-th iteration. If None and verbose > 0 then the report is printed to stdout.
            save_file (str) – File to save information during the run.
            restore_file (str) – File to restore a previous run that was saved by the save() method.
            restore_points_only (bool) – Restore the last results of a previous run and run again from them.
            global_x_transform (func) – Transformation of vectors after restore, before the run of the global optimizer.
            local_x_transform (bool) – Transformation of vectors after restore, before the run of the local optimizer.
Optimizer result

class gadma.optimizers.optimizer_result.OptimizerResult(x, y, success: bool, status: int, message: str, X, Y, n_eval: int, n_iter: int, X_out=[], Y_out=[])
    Bases: object

    Class for keeping an optimizer's result. It is based on scipy.optimize.OptimizeResult but holds more information. A short usage sketch follows the parameter list below.

    Parameters:
        x – The solution of the optimization: the best value found during the run.
        y – The value of the objective function on x.
        success (bool) – Whether or not the optimizer exited successfully.
        status (int) – Termination status of the optimizer. Its value depends on the underlying solver. Refer to message for details.
        message (str) – Description of the cause of the termination.
        X – All solutions that were used during the run.
        Y – Values of the objective function on X.
        n_eval (int) – Number of evaluations of the objective function performed by the optimizer.
        n_iter (int) – Number of iterations performed by the optimizer.
        X_out – Solutions that the optimizer chooses to report as important at the end, for example the solutions of the last iteration. The most useful usage is to restart the optimizer, or run a new one, passing them via X_init.
        Y_out – Values of the objective function on X_out.
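    A short inspection sketch, assuming result is an OptimizerResult returned by one of the optimize() calls sketched above (e.g. the genetic algorithm example):

        print(result.success, result.message)   # how the run terminated
        print(result.n_iter, result.n_eval)     # iterations and function evaluations
        print(result.x, result.y)               # best point and its objective value
        # result.X_out / result.Y_out can seed a follow-up run via X_init / Y_init.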
    static from_GPyOpt_OptimizerResult(gpyopt_obj)
        Creates an OptimizerResult from an instance of Bayesian optimization.

        Parameters:
            gpyopt_obj (GPyOpt.methods.BayesianOptimization) – Object of the GPyOpt optimizer.

    static from_SciPy_OptimizeResult(scipy_result: scipy.optimize.optimize.OptimizeResult)
        Creates an OptimizerResult from an instance of scipy.optimize.OptimizeResult. Please note that some attributes will be empty.
Linear constraint

class gadma.optimizers.linear_constrain.LinearConstrain(A, lb, ub)
    Bases: object

    Class containing a linear constraint: lb <= A*x <= ub. A small check sketch follows the parameter list below.

    Parameters:
        A – Matrix.
        lb – Lower bound.
        ub – Upper bound.
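    A toy, self-contained illustration of the lb <= A*x <= ub check that fits() performs (pure NumPy, not GADMA's own code):

        import numpy as np

        A = np.array([[1.0, 1.0],
                      [1.0, -1.0]])
        lb = np.array([0.0, -1.0])
        ub = np.array([5.0, 1.0])

        def fits(A, lb, ub, x):
            Ax = A @ np.asarray(x, dtype=float)
            return bool(np.all((lb <= Ax) & (Ax <= ub)))

        print(fits(A, lb, ub, [1.0, 1.5]))  # True: both rows lie inside their bounds
        print(fits(A, lb, ub, [4.0, 3.0]))  # False: the first row gives 7.0 > ub[0] = 5.0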
    property A
    fits(x)
        Checks that x satisfies the constraint.

        Parameters:
            x – Vector to check, i.e. whether lb <= A*x <= ub.
    property lb
    try_to_transform(x)
        Tries to transform x so that it fits the constraint. The current implementation goes through each pair of bounds and, if x multiplied by the corresponding row of A does not fit the bounds, the values of x that take part in that constraint are changed so that they fit the bounds.

        Parameters:
            x – Vector to change.

        Returns:
            The transformed x and a bool indicating whether the transformation was successful.
    property ub
gadma.optimizers.linear_constrain.my_dot(A_i, x)
    A simple implementation of the multiplication of two vectors (a row and a column), i.e. their dot product.