horn-dev mailing list archives

From "Edward J. Yoon (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HORN-8) Implementation of Parameter Server
Date Mon, 09 Nov 2015 03:27:10 GMT

     [ https://issues.apache.org/jira/browse/HORN-8?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Edward J. Yoon updated HORN-8:
------------------------------
    Description: 
The current implementation works in a synchronous way, as shown below (SmallLayeredNeuralNetworkTrainer.java, 101 lines):
{code}
task0        task1        task2

      compute updates locally

-------------------------------- send updates to the master task
-------------------------------- merge updates and broadcast them to every task

      compute updates locally

-------------------------------- send updates to the master task
-------------------------------- merge updates and broadcast them to every task
     
               ...

      (Loop until convergence)
{code}
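
For illustration, a minimal plain-Java sketch of this barrier-style cycle is below. It does not use the actual Hama BSP API; the class and names (e.g. SyncSgdSketch) are made up, and a CyclicBarrier action stands in for the master's merge-and-broadcast step. The point is that no task can start its next iteration until every task has contributed its update.

{code}
import java.util.concurrent.CyclicBarrier;

// Illustrative only: models the synchronous cycle above, not the Horn/Hama API.
public class SyncSgdSketch {

  static final int TASKS = 3;
  static final int ITERATIONS = 5;

  static final double[] localUpdates = new double[TASKS]; // one local update per task
  static volatile double mergedUpdate;                    // the "broadcast" value

  public static void main(String[] args) {
    // The barrier action plays the role of the master task: merge, then broadcast.
    CyclicBarrier barrier = new CyclicBarrier(TASKS, () -> {
      double sum = 0;
      for (double u : localUpdates) sum += u;
      mergedUpdate = sum / TASKS;
    });

    for (int t = 0; t < TASKS; t++) {
      final int task = t;
      new Thread(() -> {
        try {
          for (int i = 0; i < ITERATIONS; i++) {
            localUpdates[task] = Math.random();  // "compute updates locally"
            barrier.await();                     // send to master + wait for the merge
            double update = mergedUpdate;        // receive the broadcast result
            System.out.printf("task%d iteration %d got %.3f%n", task, i, update);
          }
        } catch (Exception e) {
          throw new RuntimeException(e);
        }
      }).start();
    }
  }
}
{code}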

By separating the master from the worker tasks, we can support asynchronous parallel SGD. My idea is simply to use task0 (a BSPTask) as the server daemon. For this issue, a single master is enough at the moment.

{code}
task0     |          task1                          ....   taskN
          |
          |
          |   compute updates locally
          |
 Receive  |<------ push updates to master task
 Update1  |                     
          +------> fetch updates
          |
          |
          |
 Receive  |<------------------------------------ ..
 Update2  |
          +------------------------------------> ..
          |
          |
{code}
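
To make the contrast concrete, here is a similar plain-Java sketch of the asynchronous scheme. Again this is illustrative only (AsyncParameterServerSketch, push, and fetch are made-up names, not the eventual Horn API): one object stands in for task0 acting as the server daemon, and the worker threads push their updates and fetch the latest parameters without ever waiting for each other.

{code}
import java.util.Arrays;

// Illustrative only: models the asynchronous push/fetch scheme above.
public class AsyncParameterServerSketch {

  // Stands in for task0 running as the server daemon.
  static class ParameterServer {
    private final double[] weights;

    ParameterServer(int size) { this.weights = new double[size]; }

    // A worker pushes its locally computed update; the server applies it
    // immediately, without waiting for the other workers.
    synchronized void push(double[] update) {
      for (int i = 0; i < weights.length; i++) {
        weights[i] += update[i];
      }
    }

    // A worker fetches the current parameters, which may already include
    // updates pushed by other workers (this is where staleness comes from).
    synchronized double[] fetch() {
      return Arrays.copyOf(weights, weights.length);
    }
  }

  public static void main(String[] args) {
    ParameterServer server = new ParameterServer(4);

    for (int t = 1; t <= 3; t++) {               // task1 .. taskN
      final int task = t;
      new Thread(() -> {
        double[] params = server.fetch();
        for (int i = 0; i < 5; i++) {
          double[] update = new double[params.length];
          for (int j = 0; j < update.length; j++) {
            update[j] = 0.01 * Math.random();    // "compute updates locally"
          }
          server.push(update);                   // push updates to the server
          params = server.fetch();               // fetch the merged parameters
          System.out.printf("task%d iteration %d params %s%n",
              task, i, Arrays.toString(params));
        }
      }).start();
    }
  }
}
{code}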


  was:
The current implementation works in a synchronous way, as shown below:
{code}
task0        task1        task2

      compute updates locally

-------------------------------- send updates to the master task
-------------------------------- merge updates and broadcast them to every task

      compute updates locally

-------------------------------- send updates to the master task
-------------------------------- merge updates and broadcast them to every task
     
               ...

      (Loop until convergence)
{code}

By separating the master from the worker tasks, we can support asynchronous parallel SGD. My idea is simply to use task0 (a BSPTask) as the server daemon. For this issue, a single master is enough at the moment.

{code}
task0     |          task1                          ....   taskN
          |
          |
          |   compute updates locally
          |
 Receive  |<------ push updates to master task
 Update1  |                     
          +------> fetch updates
          |
          |
          |
 Receive  |<------------------------------------ ..
 Update2  |
          +------------------------------------> ..
          |
          |
{code}



> Implementation of Parameter Server
> ----------------------------------
>
>                 Key: HORN-8
>                 URL: https://issues.apache.org/jira/browse/HORN-8
>             Project: Apache Horn
>          Issue Type: Improvement
>            Reporter: Edward J. Yoon
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
