mahout-user mailing list archives

From Robin Anil <robin.a...@gmail.com>
Subject Re: Making Very Large-Scale Linear Algebraic Computations Possible Via Randomization
Date Fri, 25 Sep 2009 19:10:34 GMT
You don't have to ask. Please go ahead, file a JIRA issue, and start working
on it.
http://issues.apache.org/jira/browse/MAHOUT

Robin


On Sat, Sep 26, 2009 at 12:33 AM, Jake Mannix <jake.mannix@gmail.com> wrote:

>  Those look very cool, I'd love to see how those compare with doing
> plain-old Lanczos for SVD on Hadoop.  Speaking of which, I've got an
> implementation of that which I wrote up for my own matrix library (
> http://decomposer.googlecode.com ) a while back, and I noticed that we
> still
> don't have any large-scale SVD impls in Mahout.  Is there any interest by
> the community for me to try and port that / contribute this to Mahout?
>  It's
> Apache-licensed, but I'm currently using mostly my own sparse and dense
> vector writables for use on Hadoop (designed specifically for things like
> Lanczos and AGHA), so I'd need to port them over to use whichever vector
> impls Mahout is using.
>
>  -jake
>
> On Fri, Sep 25, 2009 at 11:25 AM, Ted Dunning <ted.dunning@gmail.com>
> wrote:
>
> > Isabel,
> >
> > Very interesting post.  Here are more accessible resources:
> >
> > http://arxiv.org/abs/0909.4061
> > http://www.pnas.org/content/104/51/20167
> >
> > These provide a very interesting and solid link between random indexing
> and
> > SVD algorithms.  They also definitely provide a fantastic way to
> implement
> > large scale SVD using map-reduce.
> >
> > Nice pointer!
> >
> > 2009/9/25 Michael Brückner <mibrueck@cs.uni-potsdam.de>
> >
> > >
> > > year's NIPS (http://nips.cc/Conferences/2009/Program/event.php?ID=1491
> )
> >
> >
> >
> >
> > --
> > Ted Dunning, CTO
> > DeepDyve
> >
>
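For readers following the thread: the arXiv paper Ted links (Halko, Martinsson, Tropp) reduces a large SVD to a small one by sampling the matrix's range with a random test matrix. Below is a minimal single-machine NumPy sketch of that idea; the function name, parameters, and defaults are illustrative only and do not reflect any existing Mahout or Decomposer code.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, n_iter=2, seed=0):
    """Approximate rank-k SVD via random projection (Halko et al. sketch)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sample the range of A with a Gaussian test matrix.
    omega = rng.standard_normal((n, k + oversample))
    Y = A @ omega
    # A few power iterations sharpen accuracy when singular values decay slowly.
    for _ in range(n_iter):
        Y = A @ (A.T @ Y)
    # Orthonormal basis for the sampled range of A.
    Q, _ = np.linalg.qr(Y)
    # Project A onto the basis, then take an exact SVD of the small matrix.
    B = Q.T @ A
    Uh, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Uh
    return U[:, :k], s[:k], Vt[:k, :]
```

The appeal for Hadoop is that the expensive steps are just matrix-matrix products with a tall-skinny random matrix, which parallelize naturally over map-reduce; only the small (k+p)-by-n factor needs a dense SVD on one machine.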
