Hi Ted,
That figure is the worst case. I had asked about the amount of data, and that's what somebody
told me. In any case, I'm not going to run any tests with that amount of data.
I'm doing my final year project, and I only need to make this work with a reasonable amount
of data. So what I really need to understand is how the algorithm works and why I'm getting
different values when I compare the results with another implementation in R.
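For what it's worth, one common reason two SVD implementations appear to disagree is that singular vectors are only determined up to sign (and up to rotation when singular values are repeated), so an element-by-element comparison can report a mismatch between results that are actually equivalent. A minimal sketch of a sign-invariant comparison (the vectors below are made up for illustration, not taken from either implementation):

```python
# Singular vectors are unique only up to sign, so comparing two SVD
# implementations element-by-element can flag equivalent results as
# different. This toy check compares unit vectors up to sign.

def same_direction(u, v, tol=1e-9):
    """True if u and v span the same line, i.e. u == v or u == -v (up to tol)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sum(a * a for a in u) ** 0.5
    norm_v = sum(b * b for b in v) ** 0.5
    return abs(abs(dot) - norm_u * norm_v) < tol

# Hypothetical left singular vectors from two implementations:
u_impl1 = [0.6, 0.8]
u_impl2 = [-0.6, -0.8]   # same subspace, opposite sign

print(u_impl1 == u_impl2)              # naive comparison: False
print(same_direction(u_impl1, u_impl2))  # sign-invariant comparison: True
```

Comparing the singular values themselves (which are unique) is usually the first sanity check; only then does it make sense to compare the vectors, and only up to sign.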

> From: ted.dunning@gmail.com
> Date: Tue, 23 Nov 2010 14:55:00 -0800
> Subject: Re: Lanczos Algorithm
> To: user@mahout.apache.org
>
> I seriously doubt that, actually. 10^14 is a very large number.
>
> As far as I know, the record for computing an SVD of a large sparse matrix
> started with about 23 x 10^9 nonzero elements. You are saying that your
> problem is 100,000 times larger than this. I think that you are going to
> have to
> wait for another 15 compute speed doubling times before this becomes a
> feasible computation.
>
> On Tue, Nov 23, 2010 at 11:55 AM, PEDRO MANUEL JIMENEZ RODRIGUEZ <
> pmjimenez1983@hotmail.com> wrote:
>
> > Well, this is in the worst case, but it could be possible.
> >
> > I'm not going to run any tests with this amount of data because that is
> > impossible for me, but this project is part of a bigger one, and they would
> > have enough space to deal with this amount of data.
> >
> >
> > 
> > > From: ted.dunning@gmail.com
> > > Date: Mon, 22 Nov 2010 14:46:20 -0800
> > > Subject: Re: Lanczos Algorithm
> > > To: user@mahout.apache.org
> > >
> > > That seems like a lot. That would mean that you have 10^14 = 100 trillion
> > > nonzero elements, which would take 10PB to store with one bit per nonzero
> > > element.
> > >
> > > Are there many totally zero rows?
> > >
> > > Can you estimate how many nonzero elements you have in all?
> > >
> >
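As a side note on the storage estimate in the quoted message: at literally one bit per nonzero, 10^14 elements come to about 12.5 TB, but no sparse format stores just one bit per entry. A quick back-of-envelope check under an assumed (not from this thread) layout of a 4-byte integer index plus an 8-byte double per nonzero already lands in the petabyte range:

```python
# Back-of-envelope check of the storage estimate for 10^14 nonzeros.
# Assumed layout (not from the thread): 4-byte int index + 8-byte double value.

nonzeros = 10**14
bytes_per_nonzero = 4 + 8               # assumed index + value layout
total_bytes = nonzeros * bytes_per_nonzero

TB, PB = 10**12, 10**15
print(nonzeros / 8 / TB)   # one bit per nonzero: 12.5 TB
print(total_bytes / PB)    # index + value layout: 1.2 PB, before any overhead
```

With per-row structure, replication, and JVM overhead on top, the order of magnitude of the quoted 10PB figure is not implausible even though the "one bit" framing understates it.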
