commons-dev mailing list archives

From Gilles Sadowski <>
Subject Re: [Math] Old to new API ("MultivariateDifferentiable(Vector)Function")
Date Tue, 04 Dec 2012 15:15:09 GMT
On Tue, Dec 04, 2012 at 05:47:09AM -0500, Konstantin Berlin wrote:
> Hi,
> I think this is getting silly. What I am saying is not a matter of
> opinions but of textbook optimizations.  This is not a matter of use
> cases, but of something that is already well established.  I feel like
> this package is trying to reinvent the wheel, in a subject that is already
> well known, and very settled.
> I am not going to present myself as some expert in optimizations, however
> I am familiar with different optimization methods, why they exists, how
> they work and relate to each other, and how constraints (be they equality
> and inequality, bounds, linear constraints, and nonlinear constrains) are
> added to optimization.  Knowing what I know, I can't help but to feel it
> is important to have this perspective before trying to properly engineer
> an optimization package, even if you do not have most of these features
> implemented yet.

What you seem to fail to understand is what Commons Math is.
Currently, it is a repository of math-oriented algorithms from which users
can choose in order to solve whatever problems they are able to adapt to
the input required by those algorithms.

Illustrating with the "optimization" package: users are not directed to one
specific algorithm as _the_ solution to their problem; they are allowed to
choose any algorithm whose expected input matches what they are able to
provide. If I want to solve a non-linear least-squares problem with a
derivative-free method, I can. [Even so, I very much appreciate your advice
about derivative-based methods being provably faster than direct methods,
and I was indeed about to test Levenberg-Marquardt and see how it compares
on our function (a simulator's output).]

In that respect, Commons Math differs very much from "JOptimizer", which
indeed took the problem-oriented approach, for the perfectly valid reason
that it aims at solving specific optimization problems, each in the best
known way.

I proposed the package layout under "optim" based on what input must be
provided to the algorithms in a given package, also because this is similar
to what is done in other packages, i.e. a subdivision based on the
algorithms' (or data structures') similarities rather than on the ends to
which they are supposed to be put.

More importantly, as I've indicated before, due to lack of human resources,
we are unable to develop whole sets of tools for the sole purpose of
inclusion into Commons Math (to my knowledge, and however interesting that
would be, nobody is paid to develop CM!).

> If you are curious about how to add constraints to optimization methods or
> optimizations in general, you can read some slides here

Thanks for the pointer.

> I have issues with the suggestions that I am presented with, since IMHO
> they are missing the big picture.  I feel like least-squares methods are
> presented here as though they are an extension of optimization into
> multiple dimensions.  That is incorrect, and even if you know that, you
> are misleading the user.  Least-squares is a limited case of a general
> optimization problem.  This expression will get you an F in a math class:
> ValueAndGradient[] computeValueAndJacobian(double[] parameters);
> The vector value and the matrix Jacobian should be separated, not combined
> together.  They are not mathematically defined like this, nor are they
> used like this in optimizations, and would require extraction inside the
> optimization method.

How the optimization method will extract the data is an implementation
detail: whether it accesses it as
  value4 = DiffValueLeastSquares.values[4];
  j40 = DiffValueLeastSquares.J[4][0];
or as
  vAndG = valueAndGradient[4];
  value4 = vAndG.value;
  j40 = vAndG.gradient[0];
is completely transparent.
I admit nevertheless that if the matrix is used as an "object" and the
subsequent operations involve matrix operations, it is indeed better to
handle it as such from the start.
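To make the "transparent extraction" point concrete, here is a minimal
sketch showing both access paths side by side. The two holder types are
stubbed as plain data classes (the names are taken from the snippets above
but are hypothetical, not actual CM classes):

```java
// Sketch only: both representations stubbed as plain data holders.
class DiffValueLeastSquares {
    double[] values;   // residual vector
    double[][] J;      // Jacobian as a raw 2-D array
}

class ValueAndGradient {
    double value;      // one residual component
    double[] gradient; // the corresponding Jacobian row
}

class AccessSketch {
    // Returns {value via combined holder, J entry via combined holder,
    //          value via per-component array, J entry via per-component array}.
    static double[] bothAccessPaths(DiffValueLeastSquares combined,
                                    ValueAndGradient[] separate) {
        double v1 = combined.values[4];
        double j1 = combined.J[4][0];
        double v2 = separate[4].value;
        double j2 = separate[4].gradient[0];
        return new double[]{v1, j1, v2, j2};
    }
}
```

Either way the optimizer ends up with the same numbers; only the field
layout differs.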

I thus withdraw my proposal and will stick to what Luc proposed, e.g. the
"optimize" method in "AbstractLeastSquares" will be:
  PointVectorValuePair optimize(int maxEval,
                                MultivariateVectorFunction value,
                                MultivariateMatrixFunction jacobian,
                                OptimizationData... optData)
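A caller would then supply the value and Jacobian callbacks separately.
Below is a minimal sketch for the residuals of a straight-line model
y = a * x + b; the two interfaces are stubbed here with the same shape as
the CM ones named in the signature, so the example is self-contained rather
than a compilable excerpt of Commons Math:

```java
// Stubs mirroring the CM interfaces named in the "optimize" signature.
interface MultivariateVectorFunction {
    double[] value(double[] point);
}

interface MultivariateMatrixFunction {
    double[][] value(double[] point);
}

class SeparateCallbacksSketch {
    static final double[] X = {0, 1, 2};
    static final double[] Y = {1, 3, 5}; // exactly y = 2 * x + 1

    // Residuals r_i(a, b) = a * X[i] + b - Y[i].
    static final MultivariateVectorFunction VALUE = p -> {
        double[] r = new double[X.length];
        for (int i = 0; i < X.length; i++) {
            r[i] = p[0] * X[i] + p[1] - Y[i];
        }
        return r;
    };

    // Jacobian: J[i][0] = dr_i/da = X[i], J[i][1] = dr_i/db = 1.
    static final MultivariateMatrixFunction JACOBIAN = p -> {
        double[][] j = new double[X.length][2];
        for (int i = 0; i < X.length; i++) {
            j[i][0] = X[i];
            j[i][1] = 1;
        }
        return j;
    };
}
```

At the exact solution (a, b) = (2, 1) all residuals vanish, and the
Jacobian is constant since the model is linear in the parameters.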

> This expression prevents the use of sparse matrices,
> if that would be desired in the future.

So does your "DiffValueLeastSquares"...

That may be a good point, but without a patch we cannot hope to get much
further. For example, to allow for sparse matrices we could have a new
interface:
  public interface NewMultivariateMatrixFunction {
      RealMatrix value(double[] point);
  }
[Whereas the existing "MultivariateMatrixFunction" returns a "double[][]".]
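To illustrate what a matrix-valued return type would buy: a sparse
implementation only stores the non-zero entries, which a raw "double[][]"
cannot offer. In this sketch "Matrix" stands in for CM's "RealMatrix" so
the code is self-contained; all names other than
"NewMultivariateMatrixFunction" are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for RealMatrix: read-only entry access.
interface Matrix {
    double getEntry(int row, int col);
}

// The hypothetical interface discussed above, returning a matrix object.
interface NewMultivariateMatrixFunction {
    Matrix value(double[] point);
}

class SparseJacobianSketch {
    // Diagonal Jacobian of f_i(x) = x_i^2: only n entries are stored
    // instead of n * n.
    static final NewMultivariateMatrixFunction DIAGONAL = point -> {
        final Map<Long, Double> entries = new HashMap<>();
        for (int i = 0; i < point.length; i++) {
            entries.put(key(i, i), 2 * point[i]); // d(x_i^2)/dx_i = 2 x_i
        }
        // Unstored entries read as zero.
        return (row, col) -> entries.getOrDefault(key(row, col), 0.0);
    };

    static long key(int row, int col) {
        return ((long) row << 32) | col;
    }
}
```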

That's possible to implement now. Is it necessary? No, because no one has a
use case that demonstrates the advantage. As I said, when such data exist,
they will be taken into account.

> > Chi2 values are only used (and subsequently made accessible to the caller)
> > in the "AbstractLeastSquaresOptimizer" hierarchy.
> > In the current design, it would not make any sense to just change the return
> > value type and force every algorithm to fill in data they don't use.
> > 
> > Discussion is open but should be based on actual cases, or on patches that
> > demonstrate how the change is applied to all the implementations which are
> > supposed to be affected.
> This is a perfect example of why inheritance was created.

That sentence does not clarify what you mean.

> You guys are free to proceed how you want. I am trying to save you guys
> the trouble of having to redesign your optimization package in the near
> future.

I'm sorry, but that's not what we need, because the redesign(s) will happen
nonetheless. And this is not to be construed as a failure; it's the way a
programming project evolves when new knowledge and new features are
integrated into the picture.

> I will be happy to comment on proposals, but I no longer have the time to
> argue about fundamentals of optimizations.  I do not mean to sound like
> this, but I am really out of personal time.  I hope you take a "users"
> view point seriously.

We do take users seriously. Having a look at the bug tracking system is
ample proof.

I'll sound much less politically correct than Luc, but unfortunately you
seem not to take the developers' point of view (and time) very seriously.
You do not have the time to provide what we need (raising issues on the bug
tracking system, answering direct questions on this ML, providing use
cases, benchmarks, patches); fine, but then do not expect that we are
(always) going to do it for you!

