mahout-dev mailing list archives

From "Yexi Jiang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MAHOUT-1551) Add document to describe how to use mlp with command line
Date Mon, 14 Jul 2014 18:46:06 GMT

    [ https://issues.apache.org/jira/browse/MAHOUT-1551?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14061011#comment-14061011 ]

Yexi Jiang commented on MAHOUT-1551:
------------------------------------

[~fschueler], you are correct: the current implementation is a simple online/stochastic gradient
descent network that uses back-propagation for optimization. The user can set the number of layers,
the number of neurons in each layer, and various other parameters (such as the learning rate and
the regularization weight). The CLI version exposes only a simplified subset of these parameters,
since basic users do not need that many.
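To make the scheme concrete, here is a minimal, self-contained Python sketch of the training loop described above: online (per-example) stochastic gradient descent with back-propagation, exposing the same knobs (layer sizes, learning rate, regularization weight). This is only an illustration of the concept; it is not Mahout's implementation or its API, and all names in it are hypothetical.

```python
# Illustrative sketch only (not Mahout's code or API): a minimal MLP
# trained with online/stochastic gradient descent and back-propagation.
import math
import random


class SimpleMLP:
    def __init__(self, layer_sizes, learning_rate=0.5, reg_weight=0.0, seed=0):
        rnd = random.Random(seed)
        self.lr = learning_rate
        self.reg = reg_weight
        # One weight matrix per pair of adjacent layers; the extra row
        # holds the bias weights.
        self.weights = [
            [[rnd.uniform(-0.5, 0.5) for _ in range(layer_sizes[k + 1])]
             for _ in range(layer_sizes[k] + 1)]
            for k in range(len(layer_sizes) - 1)
        ]

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def _forward(self, x):
        # Returns the activations of every layer; each non-output layer
        # carries a constant 1.0 appended as the bias input.
        acts = [list(x) + [1.0]]
        for w in self.weights:
            prev = acts[-1]
            out = [self._sigmoid(sum(prev[i] * w[i][j] for i in range(len(prev))))
                   for j in range(len(w[0]))]
            acts.append(out + [1.0])
        acts[-1] = acts[-1][:-1]  # the output layer has no bias unit
        return acts

    def predict(self, x):
        return self._forward(x)[-1]

    def train_online(self, x, target):
        """One SGD step on a single example (online learning)."""
        acts = self._forward(x)
        # Output-layer error terms for sigmoid units with squared error.
        deltas = [(a - t) * a * (1.0 - a) for a, t in zip(acts[-1], target)]
        for layer in range(len(self.weights) - 1, -1, -1):
            w = self.weights[layer]
            prev = acts[layer]
            if layer > 0:
                # Back-propagate the error to the previous layer using the
                # pre-update weights (skipping that layer's bias unit).
                back = []
                for i in range(len(prev) - 1):
                    err = sum(w[i][j] * deltas[j] for j in range(len(deltas)))
                    back.append(err * prev[i] * (1.0 - prev[i]))
            # Gradient step with L2 regularization on the weights.
            for i in range(len(prev)):
                for j in range(len(deltas)):
                    grad = prev[i] * deltas[j] + self.reg * w[i][j]
                    w[i][j] -= self.lr * grad
            if layer > 0:
                deltas = back
```

For example, `SimpleMLP([2, 3, 1])` builds a 2-input, one-hidden-layer, single-output network, and repeatedly calling `train_online` on individual examples drives the squared error down, which is the behavior the CLI's simplified parameter set configures.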

Regards,
Yexi

> Add document to describe how to use mlp with command line
> ---------------------------------------------------------
>
>                 Key: MAHOUT-1551
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1551
>             Project: Mahout
>          Issue Type: Documentation
>          Components: Classification, CLI, Documentation
>    Affects Versions: 0.9
>            Reporter: Yexi Jiang
>              Labels: documentation
>             Fix For: 1.0
>
>         Attachments: README.md
>
>
> Add documentation about the usage of multi-layer perceptron in command line.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
