Thanks for your reply.
My matrix is not very dense; it is sparse.
I have tried Mahout's SVD, but it failed with an OutOfMemory error.
Best
Wei
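For a rough sense of scale (my own back-of-the-envelope figures, not numbers from Mahout itself), here is why the dense representation alone is enough to blow the heap, independent of the SVD. The ~100 non-zeros per row used below is a hypothetical density, purely for illustration:

```python
# Back-of-the-envelope memory estimate: dense vs. sparse storage
# for a 600,000 x 600,000 matrix of 8-byte doubles.

n = 600_000
bytes_per_double = 8

dense_bytes = n * n * bytes_per_double
print(f"dense:  {dense_bytes / 1e12:.2f} TB")  # ~2.88 TB -- far beyond any JVM heap

# Sparse storage scales with the number of non-zeros instead.
# Assume ~100 non-zeros per row (hypothetical, for illustration),
# stored in coordinate format: int row + int col + double value.
nnz = n * 100
entry_bytes = 4 + 4 + 8
sparse_bytes = nnz * entry_bytes
print(f"sparse: {sparse_bytes / 1e9:.2f} GB")  # ~0.96 GB -- tractable
```

The exact sparse overhead depends on the format (a DRM stores each row as a sparse vector, which is more compact than coordinate format), but the point stands: it is the DenseMatrix allocation that runs out of memory, not the factorization itself.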
On Fri, Mar 25, 2011 at 2:03 PM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
> You can certainly try to write it out as a DRM (distributed row
> matrix) and run stochastic SVD on Hadoop (off the trunk now); see
> MAHOUT-593. This is suitable if you have a good decay of singular
> values (if you don't, it probably just means there is so much noise
> in your data that it masks the problem you are trying to solve).
>
> The currently committed solution is not the most efficient yet, but
> it should be quite capable.
>
> If you do, let me know how it went.
>
> thanks.
> -d
>
> On Thu, Mar 24, 2011 at 10:59 PM, Dmitriy Lyubimov <dlieu.7@gmail.com>
> wrote:
> > Are you sure your matrix is dense?
> >
> > On Thu, Mar 24, 2011 at 9:59 PM, Wei Li <wei.lee04@gmail.com> wrote:
> >> Hi All:
> >>
> >> Is it possible to compute the SVD factorization of a 600,000 *
> >> 600,000 matrix using Mahout?
> >>
> >> I got an OutOfMemory error when creating the DenseMatrix.
> >>
> >> Best
> >> Wei
> >>
> >
>