commons-dev mailing list archives

From: Konstantin Berlin <>
Subject: Re: [Math] Old to new API ("MultivariateDifferentiable(Vector)Function")
Date: Fri, 30 Nov 2012 16:33:54 GMT
As a user of the optimization algorithms I am completely confused by the change. It seems
different from how optimization functions are typically used and seems to be creating a barrier
for no reason.

I am not clear why you can't just let the standard interface to an optimizer be a function
that computes the value and the Jacobian (in the case of least squares), the gradient (for
quasi-Newton methods), and, if you actually have a full Newton method, also the Hessian.
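
Something along these lines is what I have in mind (a minimal sketch; the names are made up
for illustration and are not an existing Commons Math interface):

public interface ValueAndJacobianFunction {
    /** Objective values f(x) of the least-squares problem. */
    double[] value(double[] point);
    /** Jacobian J[i][j] = df_i/dx_j, evaluated at the same point. */
    double[][] jacobian(double[] point);
}

The optimizer would then only ever see this one interface, regardless of how the user chooses
to produce the derivatives.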

If the user wants to compute the Jacobian (gradient) using finite differences, they can do
it themselves, or wrap their function in a class that you provide, which computes the finite
differences using the desired algorithm.
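
For instance, such a wrapper could look roughly like this (again just a sketch with
hypothetical names, using a plain forward difference with a fixed step):

public class FiniteDifferenceJacobian {
    /** The only thing the user has to implement. */
    public interface ValueFunction { double[] value(double[] point); }

    private final ValueFunction f;
    private final double h; // step size

    public FiniteDifferenceJacobian(ValueFunction f, double h) {
        this.f = f;
        this.h = h;
    }

    /** Forward-difference approximation of the Jacobian at x. */
    public double[][] jacobian(double[] x) {
        final double[] f0 = f.value(x);
        final double[][] jac = new double[f0.length][x.length];
        for (int j = 0; j < x.length; j++) {
            final double[] xh = x.clone();
            xh[j] += h; // perturb parameter j only
            final double[] fh = f.value(xh);
            for (int i = 0; i < f0.length; i++) {
                jac[i][j] = (fh[i] - f0[i]) / h;
            }
        }
        return jac;
    }
}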

Also I can imagine a case where computation of the Jacobian can be sped up if the function value
is already known, which you lose if two separate functions handle the derivatives and the actual
function value. For example f^2(x). You can probably derive some kind of caching scheme, but still.
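
A toy illustration of what I mean: for g(x) = f(x)^2 the derivative is g'(x) = 2*f(x)*f'(x),
so the value of f computed for the objective can be reused when the derivative is evaluated,
but only if both are computed by the same object (the inner function below is just an example):

public class SquaredFunction {
    public double value(double x) {
        final double fx = f(x);
        return fx * fx;
    }

    /** Returns {value, derivative}; f and f' are each evaluated only once. */
    public double[] valueAndDerivative(double x) {
        final double fx = f(x);
        return new double[] { fx * fx, 2 * fx * fPrime(x) };
    }

    private double f(double x)      { return Math.sin(x); } // example inner function
    private double fPrime(double x) { return Math.cos(x); }
}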

Maybe I am missing something, but I spent about an hour trying to figure out how to change my
code to adapt to your new framework. Still haven't figured it out.

On Nov 30, 2012, at 11:11 AM, Gilles Sadowski <> wrote:

> Hello.
> Context:
> 1. A user application computes the Jacobian of a multivariate vector
>    function (the output of a simulation) using finite differences.
> 2. The covariance matrix is obtained from "AbstractLeastSquaresOptimizer".
>    In the new API, the Jacobian is supposed to be "automatically" computed
>    from the "MultivariateDifferentiableVectorFunction" objective function.
> 3. The converter from "DifferentiableMultivariateVectorFunction" to
>    "MultivariateDifferentiableVectorFunction" (in "FunctionUtils") is
>    deprecated.
> 4. A "FiniteDifferencesDifferentiator" operator currently exists but only
>    for univariate functions.
>    Unless I'm mistaken, a direct extension to multiple variables won't do:
>     * because the implementation uses the symmetric formula, but in some
>       cases (bounded parameter range), it will fail, and
>     * because of the floating point representation of real values, the
>       delta for sampling the function should depend on the magnitude of
>       the parameter value around which the sampling is done, whereas the
>       "stepSize" is constant in the implementation.
> Questions:
> 1. Shouldn't we keep the converters so that users can keep their "home-made"
>    (first-order) derivative computations?
>    [Converters exist for gradient of "DifferentiableMultivariateFunction"
>    and Jacobian of "DifferentiableMultivariateVectorFunction".]
> 2. Is it worth creating the multivariate equivalent of the univariate
>    "FiniteDifferencesDifferentiator", assuming that higher orders will
>    rarely be used because of
>     * the loss of accuracy (as stated in the doc), and/or
>     * the sometimes prohibitively expensive number of evaluations of the
>       objective function? [1]
> 3. As current CM optimization algorithms need only the gradient or
>    Jacobian, would it be sufficient to only provide a limited (two-points
>    first-order) finite differences operator (with the possibility to choose
>    either "symmetric", "forward" or "backward" formula for each parameter)?
> Best regards,
> Gilles
> [1] And this cost is somewhat "hidden" (as the "DerivativeStructure" is
>    supposed to provide the derivatives for free, which is not true in this
>    case).