predictionio-user mailing list archives

From: Pat Ferrel <...@occamsmachete.com>
Subject: Re: Can I train and deploy on different machine
Date: Tue, 28 Mar 2017 15:34:40 GMT
Spark must be installed locally (so spark-submit will work), but Spark is only used to launch
the PredictionServer. No job is run on Spark for the UR during query serving.
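
For example (just a sketch; the path and port below are made up, and SPARK_HOME is whatever
conf/pio-env.sh points at on that machine), the serving machine only needs a local Spark
install so spark-submit is available:

    # conf/pio-env.sh on the serving machine points at the local Spark install
    SPARK_HOME=/opt/spark

    # launches the PredictionServer via spark-submit; no cluster jobs run while serving queries
    pio deploy --port 8000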

We typically train on a Spark driver machine that is effectively part of the Spark cluster, and
deploy on a server separate from the cluster. This is so the cluster can be stopped when not
training and no AWS charges are incurred.
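
As a rough example of that split (the host name and memory settings are hypothetical, not a
recommendation for your setup):

    # on the driver machine, with the Spark cluster up; args after -- are passed to spark-submit
    pio train -- --master spark://spark-master:7077 --driver-memory 8g --executor-memory 8g

    # then stop the cluster; the separate serving machine keeps answering queries via pio deploy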

So yes, you can, and there are often good reasons to do so.

See the Spark overview here: http://actionml.com/docs/intro_to_spark


On Mar 27, 2017, at 11:48 PM, Marius Rabenarivo <mariusrabenarivo@gmail.com> wrote:

Hello,

For the pio train command, I understand that I can use another machine with PIO, the Spark Driver,
Master, and Worker.

But is it possible to deploy on a machine without Spark installed locally, given that spark-submit
is used during deployment and org.apache.predictionio.workflow.CreateServer references a
SparkContext?

I'm using UR v0.4.2 and PredictionIO 0.10.0

Regards,

Marius

P.S. I also posted in the ActionML Google group forum: https://groups.google.com/forum/#!topic/actionml-user/9yNQgVIODvI
