commons-user mailing list archives

From Mikkel Meyer Andersen <>
Subject Re: [math] Optimization: Nelder-Mead and Levenberg-Marquardt
Date Wed, 23 Nov 2011 11:22:59 GMT
2011/11/23 annaykay <>:
> Hello everyone,
>
> I am working on the optimization of some model parameters in my
> simulation, which simulates the impact of communication on the attitude
> towards a specific topic. I want to optimize 17 parameters of a
> non-linear function to obtain a minimal error value. Therefore I
> implemented two different optimization algorithms in my simulation.
>
> First I tried the Nelder-Mead algorithm. Here my error value first
> increases, then decreases again and stagnates at an unsatisfying value.
> Is it even possible for the Nelder-Mead method that the error
> increases, even though I want to minimize it?
>
> Then I also tried a different algorithm, the Levenberg-Marquardt
> algorithm. Here the problem is that the changes in the parameters are
> too small, so the optimization already stops after one iteration.
>
> Do you have an idea how to approach this problem, or do you know a
> different optimization algorithm that could suit it?
>
> Thanks in advance!


If the function you want to optimize has several local optima, then it
is (almost) always problematic, and 17 parameters is a lot. Are you
sure that you cannot obtain an analytical solution? Have you tried
different starting values for the Nelder-Mead algorithm?
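The "different starting values" idea can be sketched as a multistart loop: run a local optimizer from several randomized starting points and keep the best result. The following is a minimal, self-contained Python sketch of that pattern (it is not the Commons Math API; the simplex minimizer, the `double_well` test function, and all parameter names are illustrative):

```python
import random

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimal Nelder-Mead simplex minimizer (illustrative only)."""
    n = len(x0)
    # Initial simplex: x0 plus one vertex offset along each coordinate axis.
    simplex = [list(x0)] + [
        [x + (step if j == i else 0.0) for j, x in enumerate(x0)] for i in range(n)
    ]
    vals = [f(p) for p in simplex]
    for _ in range(max_iter):
        # Order vertices from best (lowest f) to worst.
        order = sorted(range(n + 1), key=lambda i: vals[i])
        simplex = [simplex[i] for i in order]
        vals = [vals[i] for i in order]
        if vals[-1] - vals[0] < tol:
            break
        # Centroid of all vertices except the worst.
        c = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        worst = simplex[-1]
        xr = [c[j] + (c[j] - worst[j]) for j in range(n)]            # reflection
        fr = f(xr)
        if vals[0] <= fr < vals[-2]:
            simplex[-1], vals[-1] = xr, fr
        elif fr < vals[0]:
            xe = [c[j] + 2.0 * (c[j] - worst[j]) for j in range(n)]  # expansion
            fe = f(xe)
            simplex[-1], vals[-1] = (xe, fe) if fe < fr else (xr, fr)
        else:
            xc = [c[j] + 0.5 * (worst[j] - c[j]) for j in range(n)]  # contraction
            fc = f(xc)
            if fc < vals[-1]:
                simplex[-1], vals[-1] = xc, fc
            else:
                # Shrink every vertex toward the current best one.
                best = simplex[0]
                simplex = [best] + [
                    [(p[j] + best[j]) / 2.0 for j in range(n)] for p in simplex[1:]
                ]
                vals = [vals[0]] + [f(p) for p in simplex[1:]]
    return simplex[0], vals[0]

def multistart(f, x0, n_starts=20, spread=3.0, seed=1):
    """Run the local optimizer from several random starts, keep the best."""
    rng = random.Random(seed)
    best_x, best_v = None, float("inf")
    for _ in range(n_starts):
        start = [x + rng.uniform(-spread, spread) for x in x0]
        x, v = nelder_mead(f, start)
        if v < best_v:
            best_x, best_v = x, v
    return best_x, best_v

# A deliberately multimodal function: a tilted double well in x[0], so a
# single run started near x[0] = 1 lands in the shallower local minimum.
def double_well(x):
    return x[0] ** 4 - 2.0 * x[0] ** 2 + 0.2 * x[0] + x[1] ** 2

x_single, v_single = nelder_mead(double_well, [1.0, 0.0])
x_multi, v_multi = multistart(double_well, [0.0, 0.0])
print(v_single, v_multi)
```

A single start near the shallow basin stagnates at the local minimum (around -0.80 here), while the multistart loop reaches the deeper minimum near x[0] = -1 (around -1.20); with 17 parameters the same trick applies, though many more restarts may be needed.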

Cheers, Mikkel.

