commons-dev mailing list archives

From Phil Steitz <>
Subject Re: [math] API proposal for differentiation
Date Sun, 22 Jul 2012 19:44:07 GMT
On 7/22/12 9:35 AM, Luc Maisonobe wrote:
> Hi all,
> A few months ago, Gilles proposed to add an implementation of
> Ridder's algorithm for differentiation (see
> <>). After some discussion,
> the issue was postponed, waiting for an API for this family of
> algorithms.
> Last week, at work, someone asked me again about differentiation and I
> directed him to this issue and also to the Nabla project. He told me he
> was interested in having some support in [math] for this (I'm not sure
> he reads this list, but if he does, I hope he will join the discussion).
> The basic needs are to get derivatives of either real-valued or
> vector-valued functions, either univariate or multivariate. Depending
> on the dimensions, we have derivatives, gradients, or Jacobians. In
> some cases, the derivatives can be computed by exact methods (either
> because the user developed the code or by using an automated tool like
> Nabla). In other cases we have to rely on approximate algorithms (like
> Ridder's algorithm with iterations, or n-point finite differences).
> Most of the time, when a derivative is evaluated, either the user needs
> to evaluate the function too, or the function is evaluated as a side
> effect of the derivative computation. Numerous tools therefore pack the
> value and the derivative into one object (sometimes called a Rall
> number) and transform something like:
>   double f(double x) { ... }
> into this:
>   RallNumber f(RallNumber x) { ... }
> This is a very classical approach, very simple for languages that
> implement operator overloading (see the book "Evaluating Derivatives"
> by Andreas Griewank and Andrea Walther, SIAM 2008, or the paper "On the
> Implementation of Automatic Differentiation Tools" by Christian H.
> Bischof, Paul D. Hovland and Boyana Norris, unfortunately I was not
> able to find a public version of this paper, or simply look at the
> Wikipedia entry <> and the
> presentation on automatic differentiation, which seems to have been
> written by one of the people who contributed to the Wikipedia article
> <>).
> Using this approach is simple to understand and compatible with all
> differentiation methods: direct user code for functions that are easily
> differentiated, exact generated code produced by tools, finite
> differences methods without any iteration, and iterative methods like
> Ridder's method.
> So I suggest we use this as our basic building block. We already have
> an implementation of Rall numbers in Nabla, where it is called
> DifferentialPair
> (<>).
> We can move this class into [math] and rename it RallNumber.

+1 - though I don't really see the need or big value in renaming
this.  DifferentialPair is more descriptive.
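For readers who have not seen Rall numbers before, a minimal sketch of such a pair class is below. The class and method names are illustrative only, not the actual [nabla] DifferentialPair API; the point is just that each arithmetic operation propagates the derivative alongside the value.

```java
// Minimal sketch of a value/first-derivative pair ("Rall number").
// Names are illustrative, not the actual [nabla] DifferentialPair API.
final class DifferentialPair {
    private final double value;       // f(x)
    private final double derivative;  // f'(x)

    DifferentialPair(double value, double derivative) {
        this.value = value;
        this.derivative = derivative;
    }

    /** The free variable x, whose derivative with respect to itself is 1. */
    static DifferentialPair variable(double x) {
        return new DifferentialPair(x, 1.0);
    }

    double getValue()      { return value; }
    double getDerivative() { return derivative; }

    DifferentialPair add(DifferentialPair o) {
        // sum rule: (f + g)' = f' + g'
        return new DifferentialPair(value + o.value, derivative + o.derivative);
    }

    DifferentialPair multiply(DifferentialPair o) {
        // product rule: (f * g)' = f' * g + f * g'
        return new DifferentialPair(value * o.value,
                                    derivative * o.value + value * o.derivative);
    }

    DifferentialPair sin() {
        // chain rule: (sin f)' = f' * cos f
        return new DifferentialPair(Math.sin(value), derivative * Math.cos(value));
    }
}
```

With this, evaluating f(x) = x * sin(x) on `DifferentialPair.variable(x)` yields both f(x) and f'(x) = sin(x) + x * cos(x) in a single pass, with no separate derivative evaluation.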

> For real valued univariate functions, we have the interface
> UnivariateFunction
> <>
> that defines the single method
>   double value(double x)
> This interface is used in many places; it is simple and well defined,
> so it should be kept as is.
> We also have a DifferentiableUnivariateFunction, which returns a
> separate UnivariateFunction for the derivative:
>  UnivariateFunction derivative();
> This interface is used only by AbstractDifferentiableUnivariateSolver
> and its single implementation, NewtonSolver. This interface seems
> awkward to me, and costly, as it forces the function to be evaluated
> separately even when evaluating the derivative already implies
> computing the function a few times. For very computation-intensive
> functions, this is a major problem.
> I suggest we deprecate this interface and add a new one defining the
> method:
>  RallNumber value(RallNumber x)
> This interface could extend UnivariateFunction, in which case we use
> the same object when we want to compute only the value; it could also
> define a getPrimitive method that would return a separate primitive
> object (this is how it is done in Nabla, with the interface
> UnivariateDerivative
> (<>));
> or it could even be completely separate from UnivariateFunction.

I would say leave UnivariateFunction as is (no need to force it into
a hierarchy) and just add DifferentialPair, including getPrimitive -
basically stick with the [nabla] design.  I am +1 for deprecating
DifferentiableUnivariateFunction.
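As a sketch of what implementing such an interface could look like for a hand-differentiated function (all names here are hypothetical, not an actual [math] or [nabla] API):

```java
// Sketch of the proposed "RallNumber value(RallNumber x)" style interface,
// with a hand-differentiated implementation. Names are illustrative only.

final class RallNumber {
    final double value;       // f(x)
    final double derivative;  // f'(x)
    RallNumber(double value, double derivative) {
        this.value = value;
        this.derivative = derivative;
    }
}

interface UnivariateDifferentiable {
    /** Returns f(x) and f'(x) from a single evaluation. */
    RallNumber value(RallNumber x);
}

// f(x) = x^3 + 2x, differentiated by hand: f'(x) = 3x^2 + 2.
// The derivative of the input is propagated by the chain rule, so this
// also works when x is itself the result of another differentiable step.
final class CubicPlusLinear implements UnivariateDifferentiable {
    public RallNumber value(RallNumber x) {
        double v = x.value * x.value * x.value + 2 * x.value;
        double d = (3 * x.value * x.value + 2) * x.derivative;
        return new RallNumber(v, d);
    }
}
```

A caller seeds the computation with the free variable, e.g. `new RallNumber(x0, 1.0)`, and reads both the value and the derivative from the result.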
> Multivariate differentiable functions and vector-valued functions would
> be handled exactly the same way: changing the parameter and return
> value types from double arrays to RallNumber arrays. This would be much
> more consistent than what we currently have (we have methods called
> partialDerivatives, gradients, jacobians, and even several methods
> called derivative which do not have the same signatures).
> For functions that can be differentiated directly at development time,
> this layer would be sufficient. We can typically use it for all the
> function objects we provide (Sin, Exp, ...) as well as the operators
> (Multiply, Subtract, ...).
> For more complex functions, we need to add another layer that would
> take a simple function (say a UnivariateFunction for the real-valued
> univariate case) and would create the differentiated version. For this,
> I suggest creating a Differentiator interface (similar in spirit to
> Nabla's UnivariateDifferentiator
> <>).
> We should probably build a set of interfaces in the same spirit as the
> solver interfaces. These interfaces would be implemented by engines
> that can compute derivatives. In most cases, these engines would simply
> return a wrapper around the function they get as a parameter. The
> wrapper would implement the appropriate derivative interface. When the
> wrapper's "value" method is called, the underlying raw univariate
> "value" method would be called a number of times to compute the
> derivative.
> We could provide several implementations of these interfaces. For
> univariate real functions, we should at least provide the 2-, 4-, 6-
> and 8-point finite differences methods (we already have them in
> Nabla), and we should also provide Ridder's algorithm. Users or
> upper-level libraries and applications could provide other
> implementations as well. In fact, Nabla would be such a case: it would
> provide an algorithmic differentiation implementation and would depend
> on [math].
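To make the wrapper idea above concrete, here is a sketch of a differentiation engine based on the simplest scheme, a two-point central difference. The class name, the fixed user-chosen step size, and the `double[]` return convention are all assumptions for illustration, not a proposed API.

```java
import java.util.function.DoubleUnaryOperator;

// Sketch of a differentiation "engine" that wraps a plain function and
// estimates the derivative with a two-point central difference.
// The name and the fixed step size are illustrative assumptions.
final class CentralDifferenceDifferentiator {
    private final double h;  // step size, chosen by the caller

    CentralDifferenceDifferentiator(double h) { this.h = h; }

    /** Returns {f(x), approximate f'(x)} using three raw evaluations. */
    double[] differentiate(DoubleUnaryOperator f, double x) {
        double value = f.applyAsDouble(x);
        // two-point central difference: f'(x) ~ (f(x+h) - f(x-h)) / (2h)
        double derivative =
            (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2 * h);
        return new double[] { value, derivative };
    }
}
```

This shows why the layering matters: the raw "value" method is called several times per derivative, which is exactly the cost an exact, hand-written or generated derivative avoids.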
> We could also provide implementations for the upper dimensions
> (multivariate or vector-valued), simply using the univariate
> implementations underneath (typically, computing a gradient involves
> computing n derivatives for the n parameters of the same function,
> each time considering a different parameter as the free variable). Here
> again, more efficient implementations could be added for specific cases
> by users.
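The "n univariate derivatives" idea can be sketched as follows, here using a central difference per coordinate; the class name, the `ToDoubleFunction<double[]>` signature, and the step size are illustrative assumptions only.

```java
import java.util.function.ToDoubleFunction;

// Sketch of a gradient computed one coordinate at a time: coordinate i
// is treated as the free variable while the others are held fixed.
// Names and the step-size choice are illustrative assumptions.
final class FiniteDifferenceGradient {
    static double[] gradient(ToDoubleFunction<double[]> f,
                             double[] point, double h) {
        double[] grad = new double[point.length];
        for (int i = 0; i < point.length; i++) {
            double[] plus  = point.clone();
            double[] minus = point.clone();
            plus[i]  += h;  // vary only coordinate i
            minus[i] -= h;
            grad[i] = (f.applyAsDouble(plus) - f.applyAsDouble(minus)) / (2 * h);
        }
        return grad;
    }
}
```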
> The separation between the low-level layer (derivatives definition) and
> the upper layer (transforming a function that does not compute
> derivatives into a function that computes derivatives) is a major
> point. The guy who talked to me last week insisted on this, as he
> wanted to avoid approximate (and computation-intensive) algorithms when
> he can, and he wanted to have these algorithms (like Ridder's)
> available when he cannot do without them. So the focus is not to have
> Ridder's method by itself, but rather to compute derivatives, using
> Ridder's method or something else.
> Of course, I can provide all the raw interfaces and finite differences
> implementations. If someone provides me with the paper on Ridder's
> method, I can also implement it (or if someone wants to do it once the
> interfaces have been set up, that is OK too).
> What do you think about this?

All of this sounds reasonable.  The machinery in [nabla] for
building DifferentialPairs recursively should probably also be
brought in.

> Luc
> ---------------------------------------------------------------------
> To unsubscribe, e-mail:
> For additional commands, e-mail:
