mahout-dev mailing list archives

From Ted Dunning <ted.dunn...@gmail.com>
Subject Re: Is Cholesky too sensitive to rank deficiency?
Date Mon, 17 Mar 2014 20:39:50 GMT

This may not be an issue that can actually be cured. The Cholesky trick is akin to squaring
a number: forming A^T A squares the condition number of the problem, so some precision is
inherently lost.

With the possibility of iteration, we should consider more advanced methods for large QR. The
great value of the Cholesky trick is that one can use MapReduce with no iteration.
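
To make the squaring effect concrete, here is a minimal, self-contained sketch of the
Cholesky trick (plain Java with hypothetical names, not Mahout's code): R is taken as the
transpose of the Cholesky factor of A^T A, and Q is recovered by a triangular solve. On a
tall matrix whose columns are nearly linearly dependent, the loss of orthogonality
||Q^T Q - I|| comes out roughly eps * cond(A)^2, rather than the ~eps one would expect from
Householder QR.

// Minimal sketch of Cholesky-based QR ("the Cholesky trick"); hypothetical names,
// not Mahout code. R comes from the Cholesky factor of A^T A, and Q = A R^{-1}.
// Forming A^T A squares the condition number, which is where precision is lost.
public class CholeskyQrSketch {
  public static void main(String[] args) {
    int m = 200, n = 3;
    double delta = 1e-6;                 // how close A is to rank deficiency
    double[][] a = new double[m][n];
    java.util.Random rnd = new java.util.Random(42);
    for (int i = 0; i < m; i++) {
      double x = rnd.nextGaussian();
      a[i][0] = x;
      a[i][1] = x + delta * rnd.nextGaussian();   // nearly parallel to column 0
      a[i][2] = rnd.nextGaussian();
    }

    // G = A^T A (this is the "squaring" step)
    double[][] g = new double[n][n];
    for (int i = 0; i < n; i++)
      for (int j = 0; j < n; j++)
        for (int k = 0; k < m; k++)
          g[i][j] += a[k][i] * a[k][j];

    // Cholesky: G = L L^T, so R = L^T
    double[][] l = new double[n][n];
    for (int j = 0; j < n; j++) {
      double d = g[j][j];
      for (int k = 0; k < j; k++) d -= l[j][k] * l[j][k];
      l[j][j] = Math.sqrt(d);            // a real implementation must guard d <= 0 here
      for (int i = j + 1; i < n; i++) {
        double s = g[i][j];
        for (int k = 0; k < j; k++) s -= l[i][k] * l[j][k];
        l[i][j] = s / l[j][j];
      }
    }

    // Q = A R^{-1}: for each row, solve L q_row^T = a_row^T by forward substitution
    double[][] q = new double[m][n];
    for (int i = 0; i < m; i++) {
      for (int j = 0; j < n; j++) {
        double s = a[i][j];
        for (int k = 0; k < j; k++) s -= q[i][k] * l[j][k];
        q[i][j] = s / l[j][j];
      }
    }

    // Loss of orthogonality: ||Q^T Q - I||_F is ~eps for a good QR,
    // but grows roughly like eps * cond(A)^2 for the Cholesky trick.
    double err = 0;
    for (int i = 0; i < n; i++)
      for (int j = 0; j < n; j++) {
        double s = (i == j) ? -1 : 0;
        for (int k = 0; k < m; k++) s += q[k][i] * q[k][j];
        err += s * s;
      }
    System.out.println("||Q'Q - I||_F = " + Math.sqrt(err));
  }
}

Shrinking delta toward zero pushes cond(A)^2 past what double precision can resolve, at which
point the Cholesky step either breaks down or silently zeroes the pivot, which matches the
behavior described below.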

Sent from my iPhone

> On Mar 17, 2014, at 11:31, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
> 
> I still seem to get significant differences in norms between Householder QR
> and QR via the Cholesky trick. Our stock in-core QR seems comfortable
> populating some R values (and therefore Q columns) with values as small as
> 1e-16, whereas the Cholesky computation for L seems to set these entries to
> 0. The norms of Q in this case differ more than trivially.
> 
> Are we sure we cannot decrease the sensitivity of the Cholesky decomposition
> to small values? I have manipulated the limit there that controls the decision
> for positive-definiteness, but I am not sure I understand the algorithm well
> enough to do a meaningful sensitivity reduction.
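
For reference, the decision Dmitriy mentions typically takes the form of a relative tolerance
on the Cholesky pivot; below is a hedged illustration of that pattern (again plain Java with
hypothetical names, not Mahout's actual code). The difficulty is that an R entry of about
1e-16 relative to O(1) diagonal entries corresponds to a pivot of about 1e-32 in A^T A, far
below the ~1e-16 rounding noise of the O(1) entries it is computed from, so no tolerance
setting can recover it. That is the sense in which the problem may not be curable.

// Hedged illustration of a pivot tolerance in Cholesky (not Mahout's actual code):
// when the pivot falls below a relative threshold, the column is treated as
// linearly dependent and left at zero instead of dividing by a tiny sqrt.
public class ToleratedCholesky {

  // Returns the lower-triangular L of G = L L^T, zeroing rank-deficient columns.
  static double[][] cholesky(double[][] g, double tol) {
    int n = g.length;
    double[][] l = new double[n][n];
    for (int j = 0; j < n; j++) {
      double d = g[j][j];
      for (int k = 0; k < j; k++) d -= l[j][k] * l[j][k];
      if (d <= tol * g[j][j]) {
        continue;                        // pivot lost in rounding: declare the column dependent
      }
      l[j][j] = Math.sqrt(d);
      for (int i = j + 1; i < n; i++) {
        double s = g[i][j];
        for (int k = 0; k < j; k++) s -= l[i][k] * l[j][k];
        l[i][j] = s / l[j][j];
      }
    }
    return l;
  }

  public static void main(String[] args) {
    // Gram matrix of two columns differing only at the 1e-16 level: the exact
    // second pivot is ~1e-32, which double arithmetic cannot distinguish from 0,
    // so no choice of tol recovers it.
    double[][] g = {{1.0, 1.0}, {1.0, 1.0 + 1e-16}};
    double[][] l = cholesky(g, 1e-14);
    System.out.println("L[1][1] = " + l[1][1]);   // prints 0.0
  }
}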
