OECheckpoint1

Attention

This API is currently available in C++ and Python.

class OECheckpoint1 : public OECheckpoint0

The OECheckpoint1 class provides a “callback” facility for monitoring the progress of an optimization. It can also be used to control the termination of an optimization. This class may be used both with optimizers that use gradients and with those that do not.
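
A minimal sketch of a concrete checkpoint follows. The class name, iteration limit, and output format are illustrative and not part of the API; the header include and namespace qualification for OECheckpoint1 are assumed to follow the toolkit's conventions; and the convention that returning false requests early termination is an assumption based on the description above.

#include <cstdio>

// Hypothetical example: a checkpoint that prints progress every
// iteration and stops the optimization after a fixed iteration limit.
// The OpenEye header that declares OECheckpoint1 is assumed and omitted.
class ProgressCheckpoint : public OECheckpoint1
{
public:
  explicit ProgressCheckpoint(unsigned int maxiter) : _maxiter(maxiter) {}

  bool operator()(unsigned int iteration, unsigned int nvar, double fval,
                  double gnorm, const double *var=(double *) 0,
                  const double *grad=(double *) 0, unsigned int state=0)
  {
    std::printf("iteration %u: f = %.6f  |g| = %.6f  (nvar = %u)\n",
                iteration, fval, gnorm, nvar);

    // Assumed convention: returning true continues the optimization,
    // returning false asks the optimizer to terminate early.
    return iteration < _maxiter;
  }

private:
  unsigned int _maxiter;
};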

The following methods are publicly inherited from OECheckpoint0:

The OECheckpoint1 class defines the following public methods:

operator()

bool operator()(unsigned int iteration, unsigned int nvar, double fval,
                double gnorm, const double *var=(double *) 0,
                const double *grad=(double *) 0, unsigned int state=0)=0

This operator method is called by the optimizer at least once per iteration during an optimization. The arguments passed to the method reflect the current state of the optimization; the sketch after the parameter descriptions illustrates how they can be used.

iteration

The iteration number.

nvar

The number of variables.

fval

The most recent function evaluation during the current iteration.

gnorm

The most recent gradient norm evaluation during the current iteration. The gradient norm is defined as \(\sqrt{\sum_i g_i g_i}\).

var

Array containing the current set of optimized variables, if available. The array has a length of nvar.

grad

Array containing the current set of gradients of the optimized variables, if available. The array has a length of nvar.

state

The current state of the optimization.
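
As a further hedged sketch (the class name, tolerance, and stopping convention are assumptions, not part of the reference), the per-iteration values can drive a custom convergence test, stopping once the gradient norm falls below a threshold while guarding against the optional var and grad arrays being null:

#include <cstdio>

// Hypothetical example: stop once the gradient norm is small enough.
// The OpenEye header that declares OECheckpoint1 is assumed and omitted.
class GradientNormCheckpoint : public OECheckpoint1
{
public:
  explicit GradientNormCheckpoint(double tol) : _tol(tol) {}

  bool operator()(unsigned int iteration, unsigned int nvar, double fval,
                  double gnorm, const double *var=(double *) 0,
                  const double *grad=(double *) 0, unsigned int state=0)
  {
    // var and grad default to null, so check before dereferencing.
    if (var != 0 && nvar > 0)
      std::printf("iteration %u: var[0] = %.6f\n", iteration, var[0]);

    // Assumed convention: return false to request termination once
    // the gradient norm drops below the tolerance.
    return gnorm > _tol;
  }

private:
  double _tol;
};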