Return-Path:
X-Original-To: apmail-hama-dev-archive@www.apache.org
Delivered-To: apmail-hama-dev-archive@www.apache.org
Received: from mail.apache.org (hermes.apache.org [140.211.11.3]) by minotaur.apache.org (Postfix) with SMTP id AEA4F105D0 for ; Fri, 7 Jun 2013 02:37:31 +0000 (UTC)
Received: (qmail 90704 invoked by uid 500); 7 Jun 2013 02:37:31 -0000
Delivered-To: apmail-hama-dev-archive@hama.apache.org
Received: (qmail 90254 invoked by uid 500); 7 Jun 2013 02:37:26 -0000
Mailing-List: contact dev-help@hama.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: dev@hama.apache.org
Delivered-To: mailing list dev@hama.apache.org
Received: (qmail 89237 invoked by uid 99); 7 Jun 2013 02:37:24 -0000
Received: from arcas.apache.org (HELO arcas.apache.org) (140.211.11.28) by apache.org (qpsmtpd/0.29) with ESMTP; Fri, 07 Jun 2013 02:37:24 +0000
Date: Fri, 7 Jun 2013 02:37:24 +0000 (UTC)
From: "Hudson (JIRA)"
To: dev@hama.apache.org
Message-ID:
In-Reply-To:
References:
Subject: [jira] [Commented] (HAMA-681) Multi Layer Perceptron
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
X-JIRA-FingerPrint: 30527f35849b9dde25b450d4833f0394

[ https://issues.apache.org/jira/browse/HAMA-681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13677771#comment-13677771 ]

Hudson commented on HAMA-681:
-----------------------------

Integrated in Hama-Nightly #933 (See [https://builds.apache.org/job/Hama-Nightly/933/])
HAMA-681: Add Multi Layer Perceptron (Yexi Jiang via edwardyoon) (Revision 1490122)

Result = SUCCESS
edwardyoon :
Files :
* /hama/trunk/CHANGES.txt
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/CostFunction.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/CostFunctionFactory.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/LogisticCostFunction.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/MLPMessage.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/MultiLayerPerceptron.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/PerceptronTrainer.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/Sigmoid.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/SmallMLPMessage.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/SmallMLPTrainer.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/SmallMultiLayerPerceptron.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/SquaredError.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/SquashingFunction.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/SquashingFunctionFactory.java
* /hama/trunk/ml/src/main/java/org/apache/hama/ml/perception/Tanh.java
* /hama/trunk/ml/src/test/java/org/apache/hama/ml/perception
* /hama/trunk/ml/src/test/java/org/apache/hama/ml/perception/TestSmallMLPMessage.java
* /hama/trunk/ml/src/test/java/org/apache/hama/ml/perception/TestSmallMultiLayerPerceptron.java

> Multi Layer Perceptron
> -----------------------
>
>                 Key: HAMA-681
>                 URL: https://issues.apache.org/jira/browse/HAMA-681
>             Project: Hama
>          Issue Type: New Feature
>          Components: machine learning
>            Reporter: Christian Herta
>            Assignee: Yexi Jiang
>              Labels: patch, perceptron
>             Fix For: 0.6.2
>
>         Attachments: HAMA-681.patch, HAMA-681.patch, HAMA-681.patch, perception.patch
>
>
> Implementation of a Multilayer Perceptron (Neural Network)
> - Learning by Backpropagation
> - Distributed Learning
>
> The implementation should be the basis for the long-range goals:
> - more efficient learning (Adagrad, L-BFGS)
> - highly efficient distributed learning
> - Autoencoder - Sparse (denoising) Autoencoder
> - Deep Learning
>
> ---
> Due to the overhead of Map-Reduce (MR), MR
didn't seem to be the best strategy to distribute the learning of MLPs.
> Therefore the current implementation of the MLP (see MAHOUT-976) should be migrated to Hama. First, all dependencies on Mahout (its matrix library) must be removed to get a standalone MLP implementation. Then the Hama BSP programming model should be used to realize distributed learning.
> Different strategies for efficient synchronized weight updates have to be evaluated.
>
> Resources:
>
> Videos:
> - http://www.youtube.com/watch?v=ZmNOAtZIgIk
> - http://techtalks.tv/talks/57639/
>
> MLP and Deep Learning Tutorial:
> - http://www.stanford.edu/class/cs294a/
>
> Scientific Papers:
> - Google's "Brain" project: http://research.google.com/archive/large_deep_networks_nips2012.html
> - Neural Networks and BSP: http://ipdps.cc.gatech.edu/1998/biosp3/bispp4.pdf
> - http://jmlr.csail.mit.edu/papers/volume11/vincent10a/vincent10a.pdf

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
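The issue describes learning by backpropagation with sigmoid squashing and a squared-error cost (cf. Sigmoid.java and SquaredError.java in the committed file list). As a rough sketch of that technique only — this is not the Hama API; the class, weights, and learning rate below are all hypothetical — one gradient-descent step for a tiny 2-2-1 perceptron could look like:

```java
// Hypothetical sketch of one backpropagation step for a 2-2-1 MLP
// with sigmoid squashing and squared error; not the HAMA-681 code.
public class MlpSketch {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // Fixed illustrative weights: hidden layer w1/b1, output layer w2/b2.
    static double[][] w1 = {{0.5, -0.4}, {0.3, 0.8}};
    static double[] b1 = {0.1, -0.2};
    static double[] w2 = {0.7, -0.6};
    static double b2 = 0.05;

    // Forward pass; fills h with hidden activations, returns the output.
    static double forward(double[] x, double[] h) {
        for (int j = 0; j < 2; j++) {
            h[j] = sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]);
        }
        return sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2);
    }

    // One gradient-descent update; returns squared error before the update.
    static double trainStep(double[] x, double target, double lr) {
        double[] h = new double[2];
        double y = forward(x, h);
        double err = 0.5 * (y - target) * (y - target);
        double deltaOut = (y - target) * y * (1 - y); // dE/dnet at output
        for (int j = 0; j < 2; j++) {
            // Hidden delta uses the pre-update output weight.
            double deltaHid = deltaOut * w2[j] * h[j] * (1 - h[j]);
            w2[j] -= lr * deltaOut * h[j];
            w1[j][0] -= lr * deltaHid * x[0];
            w1[j][1] -= lr * deltaHid * x[1];
            b1[j] -= lr * deltaHid;
        }
        b2 -= lr * deltaOut;
        return err;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 0.0};
        double before = trainStep(x, 1.0, 0.5);
        double after = trainStep(x, 1.0, 0.5);
        System.out.println(after < before); // prints: true
    }
}
```

In the distributed BSP setting the issue proposes, each peer would compute such gradients on its data partition and the weight updates would then be synchronized across peers — which strategies for that synchronization work best is exactly what the reporter says has to be evaluated.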