commons-user mailing list archives

From: Luc Maisonobe <>
Subject: Re: [math] Usage of DifferentiableMultivariateRealFunction
Date: Mon, 14 May 2012 13:42:47 GMT
On 14/05/2012 14:11, Andreas Niekler wrote:
> Hello,

Hi Andreas,

> After reading a lot through the tutorial, this is the code I came up
> with for implementing a Gaussian process regression optimisation
> (file attached):
> initCovarianceAndGradients(): initialisation of the matrices and
> intermediate results that are needed by both the marginal likelihood
> calculation and the gradient calculation.
> Within this function I compute some quantities globally which are heavily
> reused by the value() and gradient() functions. What I do not really
> understand is the passing of the double[] argument to the value()
> function and to the value() function of the gradient() method. Are those
> methods called by the optimizer with the updated parameters? If that is
> the case, I have to redo the global calculations on each call to the
> value() and gradient() methods.

Yes, the double[] argument is updated at each call and corresponds to the
current estimate as the algorithm iterates towards the solution.

You cannot even rely on the calls always being scheduled in the same
way. As an example, the Gauss-Newton optimizer calls the function first
and the gradient afterwards at each iteration, but the
Levenberg-Marquardt optimizer has two nested loops and computes the
Jacobian in the outer loop and the function value in the inner loop. So
you should probably not compute everything beforehand in the hope it
will be used later on.
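
One simple way to deal with this is to recompute the shared terms
lazily, only when the point passed in differs from the one used for the
previous evaluation. Here is a rough, untested sketch against the
[math] 2.2 interfaces; the class and field names are only illustrative
and are not taken from your attached file:

import java.util.Arrays;

import org.apache.commons.math.analysis.DifferentiableMultivariateRealFunction;
import org.apache.commons.math.analysis.MultivariateRealFunction;
import org.apache.commons.math.analysis.MultivariateVectorialFunction;

public class MarginalLikelihood implements DifferentiableMultivariateRealFunction {

    // hyper-parameters for which the shared matrices were last computed
    private double[] lastPoint = null;

    // rebuild the covariance matrix, its decomposition, etc. only when
    // the optimizer has moved to a new point since the last evaluation
    private void updateSharedTerms(double[] point) {
        if (lastPoint == null || !Arrays.equals(point, lastPoint)) {
            // ... recompute the shared matrices from point ...
            lastPoint = point.clone();
        }
    }

    public double value(double[] point) {
        updateSharedTerms(point);
        // ... return the (log) marginal likelihood at point ...
        return 0.0; // placeholder
    }

    public MultivariateVectorialFunction gradient() {
        return new MultivariateVectorialFunction() {
            public double[] value(double[] point) {
                updateSharedTerms(point);
                // ... return the gradient at point ...
                return new double[point.length]; // placeholder
            }
        };
    }

    public MultivariateRealFunction partialDerivative(final int k) {
        return new MultivariateRealFunction() {
            public double value(double[] point) {
                updateSharedTerms(point);
                // ... return the k-th component of the gradient ...
                return 0.0; // placeholder
            }
        };
    }
}

Since the caching is keyed on the point itself, it does not matter in
which order the optimizer schedules the value and gradient calls.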


> Thanks for clarification
> On 14.05.2012 12:53, Gilles Sadowski wrote:
>> Hello.
>>>> Thanks for the reply. But I wonder what the input for value and
>>>> gradient is.
>>>> In DifferentiableMultivariateRealFunction this needs to be a double
>>>> array,
>>>> but what needs to be provided there? The parameters of the function to
>>>> optimize?
>>>> Thank you very much again
>>>> Andreas
>>> Do please have a look at the examples, as your question (and my
>>> answer) is too vague if not supported by proper code. I guess the
>>> answer to your question is 'yes', the double[] array is indeed the set
>>> of parameters, but again, do check the examples, I would not like to
>>> mislead you. Besides the user guide, which should provide you
>>> with the answer, have a look at this implementation [1], line 153. In
>>> this implementation, x[i] and y[i] are the data points, yhat[i] are
>>> the model predictions, and a[] are the parameters. You should be able
>>> to find your way with this example.
>> I've also just added another bit of code showcasing the usage of the
>> "non-linear least-squares" optimizers (svn revision 1338144).
>> Best regards,
>> Gilles
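
For the non-linear least-squares optimizers Gilles mentions (this is
also what Gauss-Newton and Levenberg-Marquardt expect), the interface
involved is the vectorial one: the optimizer hands its current estimate
of the parameters a[] to value() and jacobian() and compares the
returned model values with the target. A minimal, untested sketch with
purely illustrative names (the exponential model and the x array below
are not taken from [1]):

import org.apache.commons.math.analysis.DifferentiableMultivariateVectorialFunction;
import org.apache.commons.math.analysis.MultivariateMatrixFunction;

// fits yhat[i] = a[0] * exp(a[1] * x[i]) to the data points (x[i], y[i])
public class ExponentialModel implements DifferentiableMultivariateVectorialFunction {

    private final double[] x;

    public ExponentialModel(double[] x) {
        this.x = x.clone();
    }

    // model predictions yhat[i] for the current parameters a[]
    public double[] value(double[] a) {
        double[] yhat = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            yhat[i] = a[0] * Math.exp(a[1] * x[i]);
        }
        return yhat;
    }

    // Jacobian: derivative of each yhat[i] with respect to a[0] and a[1]
    public MultivariateMatrixFunction jacobian() {
        return new MultivariateMatrixFunction() {
            public double[][] value(double[] a) {
                double[][] jac = new double[x.length][2];
                for (int i = 0; i < x.length; i++) {
                    jac[i][0] = Math.exp(a[1] * x[i]);
                    jac[i][1] = a[0] * x[i] * Math.exp(a[1] * x[i]);
                }
                return jac;
            }
        };
    }
}

If I remember the 2.2 API correctly, such an instance is then passed to
the optimizer together with the target values y[], the weights and a
starting point, and the double[] the optimizer passes back at each
evaluation is always its current estimate of a[].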

