commons-issues mailing list archives

From "Gilles (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MATH-874) New API for optimizers
Date Wed, 24 Oct 2012 14:38:13 GMT

    [ https://issues.apache.org/jira/browse/MATH-874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13483278#comment-13483278 ]

Gilles commented on MATH-874:
-----------------------------

My question was about what happens before any optimizer-specific code (base class
included) is reached.

In "FunctionUtils", you created a converter:
{code}
public static UnivariateDifferentiableFunction toUnivariateDifferential(
        final DifferentiableUnivariateFunction f) {
  // ...
}
{code}

(With the nice consequence that old user code can be transparently transformed (in CM) and
passed to the new API.)
I was asking whether the same kind of converter could be written for the vector function (from
the type used in the old, now deprecated, optimizer API to the new type):
{code}
public static MultivariateDifferentiableVectorFunction toMultivariateDifferentiableVectorFunction(
        DifferentiableMultivariateVectorFunction f) {
  // ???
}
{code}
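
For illustration, here is a rough, untested sketch of how such a converter could look. It assumes
the order-1 {{DerivativeStructure}} layout (value first, then one partial derivative per
parameter) and omits the consistency checks a real implementation would need:
{code}
public static MultivariateDifferentiableVectorFunction
    toMultivariateDifferentiableVectorFunction(final DifferentiableMultivariateVectorFunction f) {

    return new MultivariateDifferentiableVectorFunction() {

        /** Plain evaluation: delegate to the old interface. */
        public double[] value(double[] point) {
            return f.value(point);
        }

        /** DerivativeStructure evaluation: rebuild order-1 structures
         *  from the old value/jacobian pair. */
        public DerivativeStructure[] value(DerivativeStructure[] t) {
            // A real implementation should first check that every t[i] is an
            // order-1 free parameter; the jacobian alone cannot provide
            // higher-order or chained derivatives.
            final int n = t.length;
            final double[] point = new double[n];
            for (int i = 0; i < n; ++i) {
                point[i] = t[i].getValue();
            }

            final double[]   y        = f.value(point);
            final double[][] jacobian = f.jacobian().value(point);

            final DerivativeStructure[] result = new DerivativeStructure[y.length];
            for (int i = 0; i < y.length; ++i) {
                // Assumed layout for n parameters at order 1:
                // value first, then one partial derivative per parameter.
                final double[] data = new double[n + 1];
                data[0] = y[i];
                System.arraycopy(jacobian[i], 0, data, 1, n);
                result[i] = new DerivativeStructure(n, 1, data);
            }
            return result;
        }
    };
}
{code}
The idea is simply the reverse direction of {{toUnivariateDifferential}}: the old value/jacobian
pair provides exactly the data needed to populate order-1 structures.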

With this converter, the deprecated method in "AbstractLeastSquaresOptimizer" (lines 302-322):

{code}
@Deprecated
public PointVectorValuePair optimize(int maxEval,
                                     final DifferentiableMultivariateVectorFunction f,
                                     final double[] target, final double[] weights,
                                     final double[] startPoint) {
    // Reset counter.
    jacobianEvaluations = 0;

    // Store least squares problem characteristics.
    jF = f.jacobian();

    // Arrays shared with the other private methods.
    point = startPoint.clone();
    rows = target.length;
    cols = point.length;

    weightedResidualJacobian = new double[rows][cols];
    this.weightedResiduals = new double[rows];

    cost = Double.POSITIVE_INFINITY;

    return optimizeInternal(maxEval, f, target, weights, startPoint);
}
{code}

can be transformed into

{code}
@Deprecated
public PointVectorValuePair optimize(int maxEval,
                                     final DifferentiableMultivariateVectorFunction f,
                                     final double[] target, final double[] weights,
                                     final double[] startPoint) {
  return optimize(maxEval,
                  FunctionUtils.toMultivariateDifferentiableVectorFunction(f),
                  target,
                  weights,
                  startPoint);
}
{code}

By which I mean that old user code will automatically use the new {{DerivativeStructure}}
API.
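
For illustration (the model below is made up, not code from the report), a pre-existing user
setup like the following would keep compiling against the deprecated overload and, with the
delegation above, would transparently run through the new code path:
{code}
// A made-up user model, written against the old interface.
DifferentiableMultivariateVectorFunction model = new DifferentiableMultivariateVectorFunction() {
    public double[] value(double[] p) {
        return new double[] { p[0] * p[0], p[0] * p[1] };
    }
    public MultivariateMatrixFunction jacobian() {
        return new MultivariateMatrixFunction() {
            public double[][] value(double[] p) {
                return new double[][] {
                    { 2 * p[0], 0    },
                    { p[1],     p[0] }
                };
            }
        };
    }
};

double[] target     = { 1, 2 };
double[] weights    = { 1, 1 };
double[] startPoint = { 1, 1 };

// Same call as before; it would now be converted internally and routed
// through the DerivativeStructure-based "optimize".
LevenbergMarquardtOptimizer optimizer = new LevenbergMarquardtOptimizer();
PointVectorValuePair optimum =
    optimizer.optimize(100, model, target, weights, startPoint);
{code}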

> New API for optimizers
> ----------------------
>
>                 Key: MATH-874
>                 URL: https://issues.apache.org/jira/browse/MATH-874
>             Project: Commons Math
>          Issue Type: Improvement
>    Affects Versions: 3.0
>            Reporter: Gilles
>            Assignee: Gilles
>            Priority: Minor
>              Labels: api-change
>             Fix For: 3.1, 4.0
>
>         Attachments: optimizers.patch
>
>
> I suggest changing the signatures of the "optimize" methods in
> * {{UnivariateOptimizer}}
> * {{MultivariateOptimizer}}
> * {{MultivariateDifferentiableOptimizer}}
> * {{MultivariateDifferentiableVectorOptimizer}}
> * {{BaseMultivariateSimpleBoundsOptimizer}}
> Currently, the arguments are
> * the allowed number of evaluations of the objective function
> * the objective function
> * the type of optimization (minimize or maximize)
> * the initial guess
> * optionally, the lower and upper bounds
> A marker interface:
> {code}
> public interface OptimizationData {}
> {code}
> would in effect be implemented by all input data so that the signature would become (for
> {{MultivariateOptimizer}}):
> {code}
> public PointValuePair optimize(MultivariateFunction f,
>                                OptimizationData... optData);
> {code}
> A [thread|http://markmail.org/message/fbmqrbf2t5pb5br5] was started on the "dev" ML.
> Initially, this proposal aimed at avoiding the need to call optimizer-specific methods. An
> example is the "setSimplex" method in "o.a.c.m.optimization.direct.SimplexOptimizer": it must
> be called before the call to "optimize". Not only does this depart from the common API, but the
> definition of the simplex also fixes the dimension of the problem; hence it would be more
> natural to pass it together with the other parameters (i.e. in "optimize") that are also
> dimension-dependent (initial guess, bounds).
> Eventually, the API will be simpler: users will
> # construct an optimizer (passing dimension-independent parameters at construction),
> # call "optimize" (passing any dimension-dependent parameters).
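
To make the proposed usage concrete, a call through the varargs signature could look roughly like
the sketch below; {{NelderMeadSimplex}}, {{InitialGuess}} and {{GoalType}} stand in for possible
{{OptimizationData}} implementations (the actual classes are whatever the patch defines), and the
handling of the maximal evaluation count is not shown:
{code}
// Rough sketch of the proposed usage; the OptimizationData arguments are
// illustrative stand-ins, not a fixed list.
SimplexOptimizer optimizer = new SimplexOptimizer(1e-10, 1e-30); // dimension-independent setup

PointValuePair result =
    optimizer.optimize(new MultivariateFunction() {
                           public double value(double[] x) {
                               return x[0] * x[0] + x[1] * x[1];
                           }
                       },
                       new NelderMeadSimplex(2),                // dimension-dependent
                       new InitialGuess(new double[] { 1, 1 }), // dimension-dependent
                       GoalType.MINIMIZE);
{code}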

