predictionio-user mailing list archives

From Pat Ferrel <>
Subject Re: Can I train and deploy on different machine
Date Tue, 28 Mar 2017 15:34:40 GMT
Spark must be installed locally (so spark-submit will work), but Spark is only used to launch
the PredictionServer. No job is run on Spark for the UR during query serving.

We typically train on a Spark driver machine that is effectively part of the Spark cluster, and deploy
on a server separate from the Spark cluster. This way the cluster can be stopped when not
training and no AWS charges are incurred.

So yes, you can, and there are often good reasons to do so.
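For reference, the split described above might look like this on the command line. This is a sketch, not from the thread: the host name, port, and memory setting are placeholders, and it assumes arguments after "--" are passed through to spark-submit for both train and deploy.

```shell
# On the Spark driver machine (part of the cluster): build and train.
# Everything after "--" is handed to spark-submit.
pio build
pio train -- --master spark://spark-master:7077 --driver-memory 8g

# On the separate serving machine. Spark is installed here only so that
# spark-submit can launch the PredictionServer; no cluster is contacted.
pio deploy -- --master local
```

Once `pio deploy` is running, the serving machine answers queries on its own; the cluster can be shut down until the next `pio train`.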

See the Spark overview here: <>

On Mar 27, 2017, at 11:48 PM, Marius Rabenarivo <> wrote:


For the pio train command, I understand that I can use another machine with PIO, the Spark Driver,
Master, and Worker.

But is it possible to deploy on a machine without Spark installed locally, since spark-submit
is used during deployment and it references the sparkContext?

I'm using UR v0.4.2 and PredictionIO 0.10.0



P.S. I also posted in the ActionML Google group forum :!topic/actionml-user/9yNQgVIODvI
