mahout-dev mailing list archives

From Hector Yee <hector....@gmail.com>
Subject Re: AdaBoost
Date Wed, 01 Jun 2011 01:35:59 GMT
Wojciech, I've opened a ticket you can watch:

 https://issues.apache.org/jira/browse/MAHOUT-716

I should have the in-core code ready in ~3 days. The gradient portion is
easily parallelizable if you want to implement it as MapReduce.
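
For illustration, here is a rough sketch of how that gradient step could be split over the data with Hadoop MapReduce (hypothetical class names, not the code attached to MAHOUT-716): each mapper emits partial gradient contributions for its input split, and a reducer (also usable as a combiner) sums them per feature.

import java.io.IOException;

import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class GradientSumJob {

  // Mapper: parse one labeled example per line ("label f0 f1 ... fd"),
  // compute its gradient contribution, and emit (featureIndex, partialGradient).
  public static class GradientMapper
      extends Mapper<LongWritable, Text, IntWritable, DoubleWritable> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      String[] tokens = value.toString().trim().split("\\s+");
      double label = Double.parseDouble(tokens[0]);
      for (int i = 1; i < tokens.length; i++) {
        double feature = Double.parseDouble(tokens[i]);
        // Placeholder: gradient of squared loss at a zero model; a real job
        // would plug in the current model's prediction here.
        double partial = -2.0 * label * feature;
        context.write(new IntWritable(i - 1), new DoubleWritable(partial));
      }
    }
  }

  // Reducer (also usable as a combiner): sum the partial gradients per feature.
  public static class GradientReducer
      extends Reducer<IntWritable, DoubleWritable, IntWritable, DoubleWritable> {

    @Override
    protected void reduce(IntWritable key, Iterable<DoubleWritable> values,
        Context context) throws IOException, InterruptedException {
      double sum = 0.0;
      for (DoubleWritable v : values) {
        sum += v.get();
      }
      context.write(key, new DoubleWritable(sum));
    }
  }
}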

On Tue, May 24, 2011 at 1:57 PM, Wojciech Indyk <wojciechindyk@gmail.com>wrote:

> Hi!
> I want to implement AdaBoost in Mahout. Would it be useful in Mahout? I
> think so, because it's a strong and very powerful algorithm, but Mahout
> is specific, so who knows :)
> I've thought about the training data and I know that I have to parallelize
> by data rather than by algorithm, so it will not be so easy - I would have
> to run all the mappers of the chosen algorithms inside my training mapper,
> but I have no idea how (architecturally) to pass the algorithms to boost
> to AdaBoost as a parameter.
>
> Regards
>
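
Regarding the question above about passing the chosen algorithm to AdaBoost as a parameter: one possible shape (just a sketch, not an existing Mahout interface) is to have the booster depend only on a small weak-learner contract and instantiate the concrete learner class by name from a job parameter, e.g.

// Hypothetical sketch of making the weak learner a pluggable parameter:
// the boosting code depends only on a small interface, and the concrete
// learner class is chosen by name at runtime.
public final class AdaBoostLearnerParam {

  // Hypothetical weak-learner contract: train on weighted examples, predict +/-1.
  public interface WeakLearner {
    void train(double[][] examples, double[] labels, double[] weights);
    double classify(double[] example);
  }

  // Instantiate the learner from a class name supplied as a job parameter,
  // e.g. a command-line flag or a Hadoop Configuration property.
  public static WeakLearner createLearner(String learnerClassName) throws Exception {
    return (WeakLearner) Class.forName(learnerClassName)
        .getDeclaredConstructor()
        .newInstance();
  }

  private AdaBoostLearnerParam() {}
}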



-- 
Yee Yang Li Hector
http://hectorgon.blogspot.com/ (tech + travel)
http://hectorgon.com (book reviews)
