commons-dev mailing list archives

From Konstantin Berlin <>
Subject Re: [Math] Cleaning up the curve fitters
Date Thu, 18 Jul 2013 13:50:48 GMT

Bringing up points with you never leads anywhere, since you refuse to accept that your proposal
might have issues, and as a last resort you always bring out the "Examples? Benchmarks?" red
herring, which has nothing to do with the logical problems of your proposal. Since you
work on Commons projects, why don't you present benchmarks showing I am wrong, given that
I have presented the theoretical reasons why what you want is problematic.

You have done a pretty good job of making the optimization package nonsensical to users. We
had a long discussion about this, with pretty much everyone but you complaining, yet for some
reason you decided unilaterally that your opinion about the package organization is correct.

If you think your GN method is terrible and never works better than LM, why do you have it in
the library? To trip up users?

You proposed a change. I just gave you several reasons why it shouldn't be done without
further thought, yet somehow you feel that is not enough. I guess you will do whatever
you wanted to do anyway, but I would hope you listen to what I just said.
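To make the trade-off concrete, here is a toy sketch in plain Java (this is NOT the Commons Math API; the model, the fixed damping factor, and every name in it are made up for illustration). It fits y = exp(a * t) to exact data: on this strictly convex, well-behaved problem the undamped Gauss-Newton step converges in fewer iterations than a fixed-damping, LM-style step.

```java
// Toy illustration only -- not the Commons Math API.  One-parameter
// nonlinear least squares: fit y = exp(a * t) to exact data generated
// with a = 0.5, comparing a pure Gauss-Newton step (lambda = 0) against
// an LM-style damped step (lambda > 0, fixed for simplicity).
public class GnVsLmSketch {
    static final double[] T = {0, 1, 2, 3, 4};
    static final double TRUE_A = 0.5; // data generated from y = exp(0.5 * t)

    /** Returns {J^T r, J^T J} for the one-parameter model y = exp(a * t). */
    static double[] normalEquations(double a) {
        double jtr = 0, jtj = 0;
        for (double t : T) {
            double model = Math.exp(a * t);
            double r = model - Math.exp(TRUE_A * t); // residual vs exact data
            double j = t * model;                    // d(model)/da
            jtr += j * r;
            jtj += j * j;
        }
        return new double[] {jtr, jtj};
    }

    /** Iterations until the step is tiny; lambda = 0 gives a pure GN step,
     *  lambda > 0 mimics an LM-style damped step. */
    public static int iterations(double a, double lambda, int maxIter) {
        for (int i = 1; i <= maxIter; i++) {
            double[] ne = normalEquations(a);
            double step = ne[0] / (ne[1] + lambda);
            a -= step;
            if (Math.abs(step) < 1e-10) {
                return i;
            }
        }
        return maxIter;
    }

    public static void main(String[] args) {
        System.out.println("pure GN iterations:          " + iterations(0.4, 0.0, 100));
        System.out.println("damped (LM-like) iterations: " + iterations(0.4, 50.0, 100));
    }
}
```

A real LM implementation adapts the damping factor per iteration, so on easy problems it approaches the pure GN step; the fixed damping here only exaggerates the difference under discussion.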

On Jul 18, 2013, at 9:34 AM, Gilles <> wrote:

> On Thu, 18 Jul 2013 09:16:46 -0400, Konstantin Berlin wrote:
>> Hi,
>> I have two points on this
>> 1) See issue  MATH-1009
> This is not directly related to my question (about cleanup of
> _existing_ code); it should thus be discussed in another thread.
>> 2) If LM was always better, there would be no GaussNewton. Clearly
>> this is not the case.
>> LM is a mixture of GN and steepest descent, so it is only faster
>> for "tougher" functions. In the case of a strictly convex function,
>> GN should be a good amount faster.
> Examples?
> Benchmarks?
> [Currently there aren't any unit tests showing the advantage for
> "GaussNewtonOptimizer". Contributions to this effect are most welcome.]
>> So the correct method depends on the problem.
>> Clearly, for some of the fitters you know whether the problem is well
>> behaved, so they should use GN; for the general method you cannot say.
> If we know in advance the best method for a given curve fitter (among those
> instantiated in package "o.a.c.m.fitting"), then we could maybe "parameterize"
> the optimizer to be used (e.g. through an abstract method "createOptimizer()").
> It would still be transparent to the users.
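A minimal sketch of that "createOptimizer()" idea (hypothetical names throughout, not the actual Commons Math types): each concrete fitter selects its optimizer internally, so the choice stays invisible to users.

```java
// Hypothetical stand-ins for the real types -- illustration only.
public class CreateOptimizerSketch {
    interface VectorOptimizer { String name(); }

    static abstract class Fitter {
        /** Each subclass picks the optimizer best suited to its problem class. */
        protected abstract VectorOptimizer createOptimizer();

        public final String fit() {
            // ... build the least-squares problem, then delegate:
            return "fitted with " + createOptimizer().name();
        }
    }

    /** A well-behaved (strictly convex) model could pick a GN-style optimizer. */
    static class WellBehavedFitter extends Fitter {
        protected VectorOptimizer createOptimizer() {
            return () -> "Gauss-Newton-like";
        }
    }

    /** The general-purpose fitter keeps the robust LM default. */
    static class GeneralFitter extends Fitter {
        protected VectorOptimizer createOptimizer() {
            return () -> "Levenberg-Marquardt-like";
        }
    }

    public static void main(String[] args) {
        System.out.println(new WellBehavedFitter().fit());
        System.out.println(new GeneralFitter().fit());
    }
}
```

The point of the pattern is that the constructor signature no longer exposes the optimizer, so changing the choice later is not a breaking API change.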
>> I think you can easily benchmark if this is the case.
> I don't understand what this means.
> Gilles
>> On Jul 17, 2013, at 11:16 AM, Gilles <> wrote:
>>> Hello.
>>> Classes in the "o.a.c.m.fitting" package take an (abstract)
>>> "MultivariateVectorOptimizer" as constructor argument. The only concrete
>>> implementations of this class are
>>> * LevenbergMarquardtOptimizer
>>> * GaussNewtonOptimizer
>>> [I.e. the API suggests that the Jacobian is not necessary for
>>> some optimizers, but no such (vector) optimizer currently exists.
>>> Anyway, the Jacobian is computed automatically (in the inner class
>>> "CurveFitter.TheoreticalValuesFunction"), so that an optimizer
>>> without derivatives is never necessary...]
>>> Observing that
>>> 1. almost all the unit tests for the fitters use an instance of
>>>   "LevenbergMarquardtOptimizer",
>>> 2. some comments in the "GaussNewtonOptimizerTest" unit test class make
>>>   one wonder when "GaussNewtonOptimizer" should actually be preferred
>>>   over "LevenbergMarquardtOptimizer",
>>> I would propose to deprecate the non-default constructors in all fitter
>>> classes (and let the fitting be transparently performed with an instance
>>> of "LevenbergMarquardtOptimizer").
>>> Any objection?
>>> Regards,
>>> Gilles
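A minimal sketch of what the proposed deprecation could look like (hypothetical class, not the actual Commons Math source): the default constructor transparently uses Levenberg-Marquardt, and the optimizer-taking constructor is marked deprecated.

```java
// Hypothetical fitter class mirroring the proposal -- illustration only.
public class DeprecationSketch {
    static class SomeFitter {
        private final String optimizerName;

        /** Default: fitting is performed transparently with Levenberg-Marquardt. */
        public SomeFitter() {
            // Stand-in for "new LevenbergMarquardtOptimizer()".
            this.optimizerName = "LevenbergMarquardtOptimizer";
        }

        /** @deprecated The optimizer choice becomes an implementation detail. */
        @Deprecated
        public SomeFitter(String optimizerName) {
            this.optimizerName = optimizerName;
        }

        public String optimizerName() { return optimizerName; }
    }

    public static void main(String[] args) {
        System.out.println(new SomeFitter().optimizerName());
    }
}
```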

To unsubscribe, e-mail:
For additional commands, e-mail:
