systemml-issues mailing list archives

From "LI Guobao (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SYSTEMML-2419) Setup and cleanup of remote workers
Date Fri, 06 Jul 2018 15:08:00 GMT

    [ https://issues.apache.org/jira/browse/SYSTEMML-2419?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16534950#comment-16534950 ]

LI Guobao edited comment on SYSTEMML-2419 at 7/6/18 3:07 PM:
-------------------------------------------------------------

[~mboehm7], I have a problem when serializing the instructions: I get some Spark instructions that cannot be serialized. My question is whether we should recreate these instructions by forcing the HOPs to CP execution type. I would also like to know how parfor handles this case, or whether it simply never generates SP instructions.
 Here is the stack trace:
{code:java}
Caused by: org.apache.sysml.runtime.DMLRuntimeException: Not supported: Instructions of type
other than CP instructions org.apache.sysml.runtime.instructions.spark.BinaryMatrixScalarSPInstruction
SPARK°max°0·SCALAR·INT·true°_mVar1279·MATRIX·DOUBLE°_mVar1280·MATRIX·DOUBLE
{code}
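For illustration, the underlying Java serialization rule can be reproduced in isolation. This is not SystemML code; `CpInstruction` and `SpInstruction` below are hypothetical stand-ins that merely mimic why a CP-style instruction ships cleanly while an SP-style one (which does not implement {{Serializable}}) throws at serialization time:

```java
import java.io.*;

// Minimal sketch (assumed names, not SystemML classes): shipping a program
// to remote workers requires Java serialization, which fails for any
// instruction class that does not implement Serializable.
public class SerializationDemo {
    // Stand-in for a CP instruction: serializable, so it can be shipped.
    static class CpInstruction implements Serializable {
        final String opcode;
        CpInstruction(String opcode) { this.opcode = opcode; }
    }

    // Stand-in for an SP instruction: does NOT implement Serializable,
    // so ObjectOutputStream.writeObject rejects it.
    static class SpInstruction {
        final Object sparkContext = new Object(); // worker-side state
    }

    // Serialize an object graph into a byte array, as would happen when
    // shipping instructions to a remote worker.
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // The CP stand-in serializes without error.
        System.out.println(serialize(new CpInstruction("max")).length > 0);
        // The SP stand-in fails, analogous to the stack trace above.
        try {
            serialize(new SpInstruction());
        } catch (NotSerializableException e) {
            System.out.println("NotSerializableException");
        }
    }
}
```

This is why either the SP instructions must be filtered or recompiled to CP before serialization, or the program must be recompiled on the worker side.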


was (Author: guobao):
[~mboehm7], I have a problem when serializing the instructions: I get some Spark instructions that cannot be serialized. My question is whether we should recreate these instructions by forcing the HOPs to CP execution type.
Here is the stack trace:

{code:java}
Caused by: org.apache.sysml.runtime.DMLRuntimeException: Not supported: Instructions of type
other than CP instructions org.apache.sysml.runtime.instructions.spark.BinaryMatrixScalarSPInstruction
SPARK°max°0·SCALAR·INT·true°_mVar1279·MATRIX·DOUBLE°_mVar1280·MATRIX·DOUBLE
{code}


> Setup and cleanup of remote workers
> -----------------------------------
>
>                 Key: SYSTEMML-2419
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-2419
>             Project: SystemML
>          Issue Type: Sub-task
>            Reporter: LI Guobao
>            Assignee: LI Guobao
>            Priority: Major
>
> In the context of a distributed Spark environment, we first need to ship the necessary functions
and variables to the remote workers, and then initialize and register the cleanup of the buffer
pool on each remote worker. All of this is inspired by the parfor implementation.
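The setup/cleanup lifecycle described above can be sketched as follows. The class and method names are hypothetical stand-ins, not SystemML's actual API; the point is only the pattern of initializing a worker-local buffer-pool directory and registering its cleanup (including on JVM shutdown):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical per-worker lifecycle (illustrative names, not SystemML's API):
// setup() creates a worker-local buffer-pool directory and registers cleanup()
// as a shutdown hook so the directory is removed even on abnormal exit.
public class WorkerLifecycle {
    private Path bufferPoolDir;

    public void setup() throws IOException {
        bufferPoolDir = Files.createTempDirectory("worker-bufferpool-");
        Runtime.getRuntime().addShutdownHook(new Thread(this::cleanup));
    }

    public void cleanup() {
        try {
            if (bufferPoolDir != null && Files.exists(bufferPoolDir))
                Files.delete(bufferPoolDir);
        } catch (IOException e) {
            // best effort during shutdown; nothing useful to do here
        }
    }

    public Path dir() { return bufferPoolDir; }
}
```

In this sketch, explicit calls to cleanup() handle the normal case, while the shutdown hook covers worker termination, mirroring how parfor registers cleanup of its remote buffer pools.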



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
