predictionio-user mailing list archives

From Marius Rabenarivo <mariusrabenar...@gmail.com>
Subject Re: Need Help Building
Date Sun, 26 Mar 2017 18:37:57 GMT
No, you don't have to build again if you didn't change the code or engine.json.
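
For example, a typical retrain cycle after adding new events would look
something like this (the app id and the events file path are placeholders,
and you may be sending events through the event server instead of pio import):

pio import --appid 1 --input /path/to/new-events.json   # load the new events
pio train    # retrain on the updated event data, no pio build needed
pio deploy   # redeploy the freshly trained model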

2017-03-26 22:28 GMT+04:00 Vaghawan Ojha <vaghawan781@gmail.com>:

> Yes, you were right! It was because the owner of the directory wasn't the
> same as the user running the pio command. Can you tell me, do I have to run
> pio build again whenever I update my events data for a new training?
>
> Thank you
> Vaghawan
>
> On Mon, Mar 27, 2017 at 12:11 AM, Marius Rabenarivo <
> mariusrabenarivo@gmail.com> wrote:
>
>> You can set JAVA_OPTS inside .profile in your home directory
>>
>> Add
>>
>> export JAVA_OPTS="-Xmx4g"
>>
>> inside your .profile or .bashrc
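>>
>> Then reload it in your current shell and check that it took effect, for
>> example:
>>
>> source ~/.profile   # or ~/.bashrc, whichever file you put it in
>> echo $JAVA_OPTS     # should print -Xmx4g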
>>
>> I think the LOCALFS access issue is due to OS file permissions.
>>
>> The owner of the directory should be the same as the user running the pio
>> command.
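>>
>> For example, something along these lines (the paths are placeholders; the
>> LOCALFS directory is whatever PIO_STORAGE_SOURCES_LOCALFS_PATH points to in
>> your pio-env.sh):
>>
>> # make the engine template dir owned by the user who runs pio
>> sudo chown -R $USER:$USER /path/to/your/engine-template
>> # same for the LOCALFS model store, if that is what you use
>> sudo chown -R $USER:$USER /path/to/localfs/models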
>>
>> 2017-03-26 22:13 GMT+04:00 Vaghawan Ojha <vaghawan781@gmail.com>:
>>
>>> the plain error is like this:
>>>
>>> [ERROR] [Storage$] Error initializing storage client for source LOCALFS
>>> Exception in thread "main" org.apache.predictionio.data.storage.StorageClientException: Data source LOCALFS was not properly initialized.
>>> at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
>>> at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:282)
>>> at scala.Option.getOrElse(Option.scala:120)
>>> at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:281)
>>> at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
>>> at org.apache.predictionio.data.storage.Storage$.getModelDataModels(Storage.scala:382)
>>> at org.apache.predictionio.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:79)
>>> at org.apache.predictionio.workflow.CreateWorkflow$.main(CreateWorkflow.scala:250)
>>> at org.apache.predictionio.workflow.CreateWorkflow.main(CreateWorkflow.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>> Without using sudo, that LOCALFS source doesn't initialize correctly. I
>>> don't know how else I could run it. Thanks
>>>
>>> On Sun, Mar 26, 2017 at 11:56 PM, Vaghawan Ojha <vaghawan781@gmail.com>
>>> wrote:
>>>
>>>> Where would I set JAVA_OPTS? Is it in pio-env.sh?
>>>>
>>>> On Sun, Mar 26, 2017 at 11:49 PM, Marius Rabenarivo <
>>>> mariusrabenarivo@gmail.com> wrote:
>>>>
>>>>> You have to add pass-through parameters to the pio train command
>>>>>
>>>>> pio train -- --executor-memory 4g --driver-memory 4g
>>>>>
>>>>> and set the JAVA_OPTS="-Xmx4g" environment variable
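>>>>>
>>>>> For example, you can also set it just for that one run, something like:
>>>>>
>>>>> JAVA_OPTS="-Xmx4g" pio train -- --executor-memory 4g --driver-memory 4g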
>>>>>
>>>>> 2017-03-26 21:37 GMT+04:00 Vaghawan Ojha <vaghawan781@gmail.com>:
>>>>>
>>>>>> Hi,
>>>>>> Thanks, but the error was because I was not inside the template dir
>>>>>> while running pio build. It builds successfully now, but it seems at
>>>>>> every step there are some crazy errors waiting for me. Now it actually
>>>>>> fails at training. Can you suggest anything from the train log?
>>>>>> I'm sorry, but these errors are really hard to figure out unless I ask for help.
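>>>>>>
>>>>>> For reference, the sequence that fixed the build step for me was roughly
>>>>>> this (the path is just a placeholder for wherever the engine template,
>>>>>> i.e. the dir containing engine.json and build.sbt, lives):
>>>>>>
>>>>>> cd /path/to/your/engine-template
>>>>>> pio build --verbose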
>>>>>>
>>>>>> Thank you very much
>>>>>>
>>>>>> On Sun, Mar 26, 2017 at 10:00 PM, Marius Rabenarivo <
>>>>>> mariusrabenarivo@gmail.com> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> The error is :
>>>>>>>
>>>>>>> [ERROR] [Storage$] Error initializing storage client for source PGSQL
>>>>>>>
>>>>>>> I think you need to change it to HBASE if you want to use HBase
>>>>>>>
>>>>>>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
>>>>>>> ->
>>>>>>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>>>>>>>
>>>>>>> in your pio-env.sh
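>>>>>>>
>>>>>>> For example, the relevant part of pio-env.sh would look roughly like
>>>>>>> this (the HBase home path is just a placeholder for your own install):
>>>>>>>
>>>>>>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>>>>>>> PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
>>>>>>> PIO_STORAGE_SOURCES_HBASE_HOME=/path/to/hbase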
>>>>>>>
>>>>>>> And start HBase beforehand if you're not using the pio-start-all script.
>>>>>>>
>>>>>>> If you want to use PostgreSQL, pio-start-all attempts to start it too.
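>>>>>>>
>>>>>>> Either way, you can check that the storage backends are reachable with
>>>>>>> something like:
>>>>>>>
>>>>>>> pio-start-all   # starts the event server and the configured backends
>>>>>>> pio status      # verifies the storage configuration and connectivity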
>>>>>>>
>>>>>>>
>>>>>>> 2017-03-26 19:29 GMT+04:00 Vaghawan Ojha <vaghawan781@gmail.com>:
>>>>>>>
>>>>>>>> I followed the manual install procedure, and everything was fine
>>>>>>>> until I stumbled on pio build.
>>>>>>>>
>>>>>>>> I have a directory like /abc/pio0.0.10/pio, and inside that another
>>>>>>>> dir pio, so in total it looks like: /abc/pio0.0.10/pio/
>>>>>>>>
>>>>>>>> Where do I actually run the build? Inside /abc/pio0.0.10 or
>>>>>>>> /abc/pio0.0.10/pio/ ?
>>>>>>>>
>>>>>>>> I don't know, but I get some weird errors which I can't properly
>>>>>>>> diagnose. I've attached my log file here. I followed the quickstart to
>>>>>>>> load the engine template:
>>>>>>>> http://predictionio.incubator.apache.org/templates/recommendation/quickstart/
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
