Hi Luc,
>
> I don't fully agree with this. Both numerical and analytical approach are
> useful and have advantages and drawbacks.
>
Wowww!
I certainly did not want to start a debate on this topic. I'm just
reporting on a conference talk I heard, which led me to think that CM
users might find such a feature useful. I think we can safely assume
Prevost knows what he is talking about, he has been around long enough
in the FE community. Having said that, I'm confident you also know
what you are talking about... So maybe you both are talking about
slightly different things. Prevost is differentiating a material
constitutive law: multivariate, yes, but not too many variables. So
maybe the time overhead is not so great at low dimensionality, you
tell me.
As for automatic differentiation: that topic was not raised in his
talk, and I'm sure he is using external CAS software and
copy/pasting the results into his huge FORTRAN (ughhh) code.
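As an aside, here is a minimal, self-contained Java sketch of the cost argument discussed below: computing a gradient by central finite differences needs two extra evaluations of f per variable, i.e. roughly 2n evaluations in total. The class and function here are made up for illustration, not taken from CM.

```java
/**
 * Illustration of the ~2n cost of a finite-difference gradient:
 * each of the n components needs two evaluations of f.
 * The example function f(x) = sum of x_i^2 is just a stand-in.
 */
public class CheapGradientDemo {

    /** Number of times f has been evaluated. */
    static long evalCount = 0;

    /** Example function: f(x) = x_0^2 + x_1^2 + ... (gradient is 2x). */
    static double f(double[] x) {
        evalCount++;
        double s = 0;
        for (double xi : x) {
            s += xi * xi;
        }
        return s;
    }

    /** Central finite-difference gradient: 2n evaluations of f. */
    static double[] centralDifferenceGradient(double[] x, double h) {
        int n = x.length;
        double[] grad = new double[n];
        for (int i = 0; i < n; i++) {
            double xi = x[i];
            x[i] = xi + h;
            double fPlus = f(x);
            x[i] = xi - h;
            double fMinus = f(x);
            x[i] = xi; // restore the perturbed coordinate
            grad[i] = (fPlus - fMinus) / (2 * h);
        }
        return grad;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0};
        double[] g = centralDifferenceGradient(x, 1e-6);
        // n = 3 variables, so f was evaluated 2n = 6 times
        System.out.println("evaluations = " + evalCount);
        System.out.println("grad = [" + g[0] + ", " + g[1] + ", " + g[2] + "]");
    }
}
```

With reverse-mode automatic differentiation, by contrast, the whole gradient would cost a small constant multiple of one evaluation of f, independent of n.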
Sébastien
2011/8/12 Luc Maisonobe <Luc.Maisonobe@free.fr>:
> Hi Sébastien,
>
> Le 12/08/2011 07:50, Sébastien Brisard a écrit :
>>
>> As Patrick suggested, this approach should really be extended to
>> multivariate functions. To cite but one example, I recently attended a
>> conf where Pr. Prevost (Princeton) talked about nonlinear finite
>> elements calculations. The long-standing approach had always been to
>> implement the analytical expressions of the tangent stiffness (which is
>> nothing but a Jacobian matrix). He argued strongly against it for at
>> least two reasons:
>>  - it is error-prone,
>>  - most of the time, the expressions are so complex that their
>> evaluation is just as time-consuming as a numerical derivative.
> The fact that the analytical approach is error-prone is true only when
> analytical differentiation is done manually. Using automatic differentiation
> completely removes this problem (take a look at Nabla).
>
> The claim that the expressions are as time-consuming as numerical
> derivatives is simply false when speaking about multivariate functions. This
> result is known as the "cheap gradient" property. The relative computing
> effort for gradients or Jacobians using finite differences for an n-variable
> function, with respect to the basic function evaluation, is roughly 2n. Using
> the automatic differentiation technique known as "reverse mode" (which is not
> implemented in Nabla yet, but should be at some point in the future), this
> cost is about 4 and is *independent of n*, which is really an amazing result.
>
>> So, having some robust algorithms for multidimensional functions
>> already implemented in CM would in my view be invaluable.
> Luc
>
>> Sébastien
>>
>> 
>> To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
>> For additional commands, e-mail: dev-help@commons.apache.org
>>
>>
>
>

