Subject: Re: What about implementing ELM?
From: Reto Matter
To: dev@mahout.apache.org
Date: Tue, 30 Apr 2013 16:26:50 +0200

Hmm, this sounds like a cool idea....

On Tue, Apr 30, 2013 at 4:11 PM, Sean Owen wrote:

> I've just skimmed it and so probably missed some key details, but this
> looks like a hidden-layer model where you just randomly pick values
> for the hidden-layer parameters, and then solve a simple linear
> regression model to predict outputs from the randomized hidden layer.
> The random values are never tuned or learned. It sounds too good to be
> true at first, and the test results show it does worse on regression
> tasks (?), but it gets close and is simple.
>
> Maybe you could think of it as an ensemble type of approach. You make
> a bunch of random projections of the input, each of which is then used
> to solve a different regression problem for the same output. Those
> answers are combined via weights that you learn in one step.
>
> On Tue, Apr 30, 2013 at 2:20 PM, Reto Matter wrote:
> > As far as I understand ELMs, the main difference is that learning in that
> > particular setting comes down to 3 relatively simple steps, and in fact no
> > iteration as in other learning algorithms (e.g. backpropagation) is
> > needed. So, in that respect, the learning phase is blazingly fast
> > compared to other approaches.
> > I don't think they are any better in terms of generalization
> > capabilities, but I haven't studied the theory behind ELMs well enough
> > to really be sure...
> >
> > greets,
> > reto
> >
> > On Tue, Apr 30, 2013 at 2:45 PM, Louis Hénault wrote:
> >
> >> I am not at home where I have my course notes about it, but you can
> >> have a look here, for example:
> >> http://msrvideo.vo.msecnd.net/rmcvideos/144113/dl/144113.pdf
> >> On page 50 there is a comparison between SVM and ELM, and ELM
> >> outperforms SVM in both testing and training times.
> >>
> >> It is not easy to give theoretical reasons why ELMs are so quick
> >> compared to SVMs, but they are.
> >>
> >> If someone is interested in working on it with me, just tell me.
> >>
> >> 2013/4/30 Sean Owen
> >>
> >> > If you care to work on it, you should work on it. Implementations
> >> > exist or don't exist because someone created them, or nobody was
> >> > interested in creating them.
> >> >
> >> > I have never heard of 'extreme learning' and found this summary:
> >> >
> >> > http://www.slideshare.net/formatc666/extreme-learning-machinetheory-and-applications
> >> >
> >> > If it's accurate, this is just describing a single-hidden-layer
> >> > model trained with back propagation. I don't see what's new; the
> >> > part about learning the beta weights is simple linear algebra.
> >> >
> >> > If it's just a hidden-layer model, it's not necessarily better than
> >> > SVMs, no.
> >> >
> >> > On Tue, Apr 30, 2013 at 11:05 AM, Louis Hénault <louis.henault@level5.fr>
> >> > wrote:
> >> > > Hi everybody,
> >> > >
> >> > > Many people are trying to integrate SVM into Mahout. I can
> >> > > understand that, since SVMs are really efficient in a "small data"
> >> > > context.
> >> > > But, as you may know, SVMs have:
> >> > > - a slow learning speed
> >> > > - poor learning scalability
> >> > >
> >> > > In contrast, ELMs give results which are usually at least as good
> >> > > as SVMs' and are something like 1000x faster.
> >> > > So, why not try to work on this topic?
> >> > >
> >> > > (Sorry if someone already talked about it; I'm new on this mailing
> >> > > list and did not find anything after some searching.)
> >> > >
> >> > > Regards
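[A later follow-up sketching the idea in code.]

The three steps discussed above (pick random, untrained hidden-layer weights; compute the hidden activations; solve one linear least-squares problem for the output "beta" weights) can be sketched in a few lines. This is only a toy illustration in plain Python, not a Mahout implementation; all function names here are made up for the example, and the tiny ridge term is an assumption added to keep the normal equations well-conditioned.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def hidden_activations(X, W, b):
    # H[k][i] = sigmoid(<W[i], X[k]> + b[i]); the random layer is never tuned.
    return [[sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + bi)
             for w, bi in zip(W, b)] for x in X]

def gauss_solve(A, c):
    # Solve A x = c by Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [ci] for row, ci in zip(A, c)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for j in range(col, n + 1):
                M[r][j] -= f * M[col][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def elm_train(X, y, hidden=20, ridge=1e-6, seed=0):
    rng = random.Random(seed)
    d = len(X[0])
    # Step 1: random input->hidden weights and biases (never learned).
    W = [[rng.uniform(-1, 1) for _ in range(d)] for _ in range(hidden)]
    b = [rng.uniform(-1, 1) for _ in range(hidden)]
    # Step 2: hidden-layer output matrix H.
    H = hidden_activations(X, W, b)
    # Step 3: beta = (H^T H + ridge*I)^-1 H^T y  -- one linear solve,
    # no iterative training at all.
    A = [[sum(H[k][i] * H[k][j] for k in range(len(H)))
          + (ridge if i == j else 0.0)
          for j in range(hidden)] for i in range(hidden)]
    c = [sum(H[k][i] * y[k] for k in range(len(H))) for i in range(hidden)]
    beta = gauss_solve(A, c)
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    H = hidden_activations(X, W, b)
    return [sum(hi * bi for hi, bi in zip(h, beta)) for h in H]
```

For example, fitting y = x0 + x1 on a small grid: the only "training" cost is building H and one linear solve, which is exactly why the learning phase is so fast compared to iterative backpropagation.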