hama-dev mailing list archives

From "Yexi Jiang (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HAMA-681) Multi Layer Perceptron
Date Wed, 05 Jun 2013 14:50:21 GMT

     [ https://issues.apache.org/jira/browse/HAMA-681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yexi Jiang updated HAMA-681:
----------------------------

    Attachment: HAMA-681.patch

It seems I have found the cause of this problem.

As Hama executes the test cases in parallel and two of my test cases generated files with the
same name, the two tests would corrupt each other's files when run at the same time.

I resolved it by using different file names and tested the fix on two different machines. Hopefully
it will pass on yours.
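The general idea behind such a fix can be sketched as follows; this is not the actual patch, just a minimal illustration of giving each test a uniquely named file (via java.io.File.createTempFile) so parallel tests cannot clobber each other's data:

```java
import java.io.File;
import java.io.IOException;

public class UniqueTestFiles {
    public static void main(String[] args) throws IOException {
        // Each test writes to its own uniquely named temp file, so tests
        // executed in parallel cannot overwrite each other's output.
        // The prefixes below only echo the test class names for clarity.
        File a = File.createTempFile("TestSmallMLPMessage", ".data");
        File b = File.createTempFile("TestSmallMultiLayerPerceptron", ".data");
        a.deleteOnExit();
        b.deleteOnExit();
        // createTempFile guarantees the two names differ.
        System.out.println(a.getName().equals(b.getName()) ? "collision" : "unique");
    }
}
```
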

Note:
The test command I used is: 
mvn -Dtest=org.apache.hama.ml.perception.TestSmallMLPMessage,org.apache.hama.ml.perception.TestSmallMultiLayerPerceptron
test

The output:
[INFO] Apache Hama parent POM ............................ SUCCESS [0.999s]
[INFO] core .............................................. SUCCESS [3.211s]
[INFO] graph ............................................. SUCCESS [0.649s]
[INFO] machine learning .................................. SUCCESS [34.969s]
[INFO] examples .......................................... SUCCESS [0.423s]
[INFO] yarn .............................................. SUCCESS [1.669s]
[INFO] hama-dist ......................................... SUCCESS [0.012s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 42.477s
[INFO] Finished at: Wed Jun 05 10:46:19 EDT 2013
[INFO] Final Memory: 27M/981M
[INFO] ------------------------------------------------------------------------
                
> Multi Layer Perceptron 
> -----------------------
>
>                 Key: HAMA-681
>                 URL: https://issues.apache.org/jira/browse/HAMA-681
>             Project: Hama
>          Issue Type: New Feature
>          Components: machine learning
>            Reporter: Christian Herta
>            Assignee: Yexi Jiang
>              Labels: patch, perceptron
>         Attachments: HAMA-681.patch, HAMA-681.patch, HAMA-681.patch, perception.patch
>
>
> Implementation of a Multilayer Perceptron (Neural Network)
>  - Learning by Backpropagation 
>  - Distributed Learning
> The implementation should be the basis for the long range goals:
>  - more efficient learning (AdaGrad, L-BFGS)
>  - highly efficient distributed learning
>  - Autoencoder - Sparse (denoising) Autoencoder
>  - Deep Learning
>  
> ---
> Due to the overhead of MapReduce (MR), MR did not seem to be the best strategy to distribute
> the learning of MLPs.
> Therefore the current implementation of the MLP (see MAHOUT-976) should be migrated to
> Hama. First, all dependencies on Mahout (its matrix library) must be removed to get a standalone
> MLP implementation. Then the Hama BSP programming model should be used to realize distributed
> learning.
> Different strategies for efficient synchronized weight updates have to be evaluated.
> Resources:
>  Videos:
>     - http://www.youtube.com/watch?v=ZmNOAtZIgIk
>     - http://techtalks.tv/talks/57639/
>  MLP and Deep Learning Tutorial:
>  - http://www.stanford.edu/class/cs294a/
>  Scientific Papers:
>  - Google's "Brain" project: 
> http://research.google.com/archive/large_deep_networks_nips2012.html
>  - Neural Networks and BSP: http://ipdps.cc.gatech.edu/1998/biosp3/bispp4.pdf
>  - http://jmlr.csail.mit.edu/papers/volume11/vincent10a/vincent10a.pdf

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
