systemml-dev mailing list archives

From Nantia Makrynioti <>
Subject Imports not recognized when running PyDML from Spark ML context
Date Tue, 08 Nov 2016 10:48:14 GMT

I am trying to run the PyDML script below using the Spark ML Context.

import systemml as sml
import numpy as np
sml.setSparkContext(sc)
m1 = sml.matrix(np.ones((3,3)) + 2)
m2 = sml.matrix(np.ones((3,3)) + 3)
m2 = m1 * (m2 + m1)
m4 = 1.0 - m2
m4.sum(axis=1).toNumPyArray()
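(For reference, here is a plain-Python sanity check of the arithmetic the script performs. It does not use SystemML or NumPy; since every matrix in the snippet is 3x3 and constant-valued, a scalar stands in for each one.)

```python
# Plain-Python check of the arithmetic in the script above.
# Every matrix is 3x3 with all entries equal, so scalars suffice.
m1 = 1.0 + 2          # np.ones((3,3)) + 2  -> every entry is 3.0
m2 = 1.0 + 3          # np.ones((3,3)) + 3  -> every entry is 4.0
m2 = m1 * (m2 + m1)   # 3 * (4 + 3) = 21 for every entry
m4 = 1.0 - m2         # 1 - 21 = -20 for every entry
row_sum = 3 * m4      # sum over axis=1 of a 3x3 -> -60 per row
print(row_sum)        # -60.0
```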

I start the Spark Shell and create an MLContext successfully. Then I load the
script from a file using the following command:

val s4 = ScriptFactory.pydmlFromFile("test.pydml")

Finally, I execute the script using


The imports are not recognized. I suppose that the first import and the call
setting the Spark context are not required, since we set up an MLContext after
starting the Spark Shell, but what about NumPy? I am a bit confused about what
changes I need to make to run this example.

Thank you in advance for your help,
