>>
>> My opinion is that the package should be organized by what it does rather than how
>> it does it.
>
> My proposal is based on what the user wants to do and on what input is
> required in order to use the tools in the given package, where all
> algorithms will share the same interface.
>
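For concreteness, I take "all algorithms will share the same interface" to mean something like the sketch below. Every name here (Optimizer, FixedStepDescent) is hypothetical, invented only to illustrate the idea — it is not the actual Commons Math API:

```java
import java.util.function.Function;

// Hypothetical shared interface: every algorithm, whatever its internals,
// exposes the same optimize(...) signature.
interface Optimizer {
    /** Minimize the objective starting from an initial guess. */
    double[] optimize(Function<double[], Double> objective, double[] start);
}

// A deliberately trivial implementation (fixed-step coordinate descent),
// just to show that any algorithm plugs in behind the same signature.
class FixedStepDescent implements Optimizer {
    private final double step;
    FixedStepDescent(double step) { this.step = step; }

    @Override
    public double[] optimize(Function<double[], Double> f, double[] start) {
        double[] x = start.clone();
        for (int iter = 0; iter < 1000; iter++) {
            for (int i = 0; i < x.length; i++) {
                double fx = f.apply(x);
                x[i] += step;                    // try a step forward
                if (f.apply(x) >= fx) {
                    x[i] -= 2 * step;            // no gain: try a step backward
                    if (f.apply(x) >= fx) {
                        x[i] += step;            // neither helped: restore
                    }
                }
            }
        }
        return x;
    }
}
```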
I humbly disagree with your claim. The user does not want a vector or a Jacobian. The user
has a problem, and they want to see their options for a solution. That is why they will go
into, for example, leastsquares. There they will see that there are different kinds, so they
will go into linear, for example. There they will see all their options and what inputs they
need to make it work.
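To illustrate what "what they need to make it work" means at that level: linear least squares asks only for a design matrix and the observations — no Jacobian, no gradient. A minimal normal-equations sketch (my own, not the Commons Math implementation):

```java
// Sketch only: solves the normal equations (A^T A) x = A^T b for a linear
// least-squares fit. The inputs are exactly a design matrix and observations,
// which is the point of grouping by problem type.
class LinearLeastSquares {
    static double[] solve(double[][] a, double[] b) {
        int n = a[0].length;
        // Form A^T A and A^T b.
        double[][] ata = new double[n][n];
        double[] atb = new double[n];
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < n; j++) {
                atb[j] += a[i][j] * b[i];
                for (int k = 0; k < n; k++) {
                    ata[j][k] += a[i][j] * a[i][k];
                }
            }
        }
        // Gaussian elimination without pivoting (fine for this small sketch).
        for (int p = 0; p < n; p++) {
            for (int r = p + 1; r < n; r++) {
                double factor = ata[r][p] / ata[p][p];
                for (int c = p; c < n; c++) {
                    ata[r][c] -= factor * ata[p][c];
                }
                atb[r] -= factor * atb[p];
            }
        }
        // Back substitution.
        double[] x = new double[n];
        for (int p = n - 1; p >= 0; p--) {
            double sum = atb[p];
            for (int c = p + 1; c < n; c++) {
                sum -= ata[p][c] * x[c];
            }
            x[p] = sum / ata[p][p];
        }
        return x;
    }
}
```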
I also have a problem with how you are targeting the package at naive people who want to play
around with packages. You want the package to be easily accessible to people who require
optimization. I would imagine most users have a good idea of what kind of problem they have
and are looking for the optimizer that will solve it. By structuring the packages the way I
describe, they first go through the general problem that they have, and then go down into
more specific detail with every level. That seems the most natural to me.
>> My thinking is
>>
>> optim
>> optim.scalar.
>> optim.scalar.linear
>> optim.scalar.socp (second order cone programming)
>> optim.scalar.qcqp
>> optim.scalar.nonlinear
>> optim.scalar.nonlinear.derivfree
>> optim.scalar.nonlinear.derivfree.Powell, etc
>> optim.scalar.nonlinear.newton
>>
>> optim.scalar.univariate.*
>>
>> optim.leastsquares.linear
>> optim.leastsquares.nonlinear
>
> IMHO, the problem with the above is that it is targeted at optimization
> experts (who would know what all the abbreviations mean and what to look
> for).
> In the other approach, a non-expert can go to a package, read the top-level
> doc and start experimenting, knowing what to plug in. [I know, this is not
> a strong argument; but it lowers the barrier to entry. :)]
>
> Perhaps there are intersections between the two approaches.
>
>> But I am flexible. Perhaps it is worth a look here:
>> http://www.joptimizer.com/
>
> Thanks for the pointer.
>
> Gilles
>
>>
>>>
>>> Shall we also introduce entirely new packages?
>>>
>>> optim
>>>
>>> optim.scalar.noderiv
>>> optim.scalar.noderiv.PowellOptimizer
>>> optim.scalar.noderiv.SimplexOptimizer
>>> optim.scalar.noderiv.CMAESOptimizer
>>> optim.scalar.noderiv.BOBYQAOptimizer
>>>
>>> optim.scalar.gradient
>>> optim.scalar.gradient.NonLinearConjugateGradientOptimizer
>>>
>>> optim.vector
>>> optim.vector.jacobian
>>> optim.vector.jacobian.AbstractLeastSquaresOptimizer
>>> optim.vector.jacobian.LevenbergMarquardtOptimizer
>>> optim.vector.jacobian.GaussNewtonOptimizer
>>>
>>> optim.scalar.univariate.noderiv
>>> optim.scalar.univariate.noderiv.BrentOptimizer
>>>
>
> 
>
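If it helps the discussion: the noderiv / gradient / jacobian split above can be read as "one sub-package per required input set". A purely hypothetical sketch of what each family would ask of the caller (none of these are the real classes or signatures):

```java
import java.util.function.Function;

// Hypothetical illustration of the grouping principle behind the proposed
// layout: each sub-package corresponds to the inputs its algorithms need.

interface NoDerivOptimizer {          // optim.scalar.noderiv: objective only
    double[] optimize(Function<double[], Double> f, double[] start);
}

interface GradientOptimizer {         // optim.scalar.gradient: objective + gradient
    double[] optimize(Function<double[], Double> f,
                      Function<double[], double[]> gradient, double[] start);
}

interface JacobianOptimizer {         // optim.vector.jacobian: residuals + Jacobian
    double[] optimize(Function<double[], double[]> residuals,
                      Function<double[], double[][]> jacobian, double[] start);
}
```

Under this reading, a user who cannot supply derivatives knows immediately that only the noderiv sub-package applies to them.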

To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
For additional commands, e-mail: dev-help@commons.apache.org
