predictionio-user mailing list archives

From Shane Johnson <shanewaldenjohn...@gmail.com>
Subject Re: 32 ERROR Storage$: Error initializing storage client for source PGSQL
Date Mon, 22 May 2017 14:08:01 GMT
Thanks Chan,

I added `$SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR,$JDBC_JAR $@`
but did not have any success. You mentioned that $JDBC_JAR needs to be in
$PIO_HOME/lib/spark/. Here is my current structure; it looks like
'pio-data-jdbc-assembly-0.11.0-incubating.jar' is already in $PIO_HOME/lib/spark/
when the package is installed.

[inline image: directory listing of $PIO_HOME/lib/spark/]

Can you expound on what you mean by `where $JDBC_JAR is in
$PIO_HOME/lib/spark/`?

Does this need to be packaged up, similar to what is happening here?


 echo "Starting the PIO shell with the Apache Spark Shell."
  # compute the $ASSEMPLY_JAR, the location of the assemply jar, with
  # bin/compute-classpath.sh
  . ${PIO_HOME}/bin/compute-classpath.sh
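
To check my understanding, here is my best guess at what the edited block
should look like. This is only a sketch: I'm assuming $JDBC_JAR should point
at the JDBC assembly that ships in lib/spark/, and I used a comma because
Spark's --jars flag takes a comma-separated list:

  echo "Starting the PIO shell with the Apache Spark Shell."
  # compute the $ASSEMBLY_JAR, the location of the assembly jar, with
  # bin/compute-classpath.sh
  . ${PIO_HOME}/bin/compute-classpath.sh
  # assumption: point JDBC_JAR at the JDBC assembly bundled with the package
  JDBC_JAR=${PIO_HOME}/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar
  shift
  $SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR,$JDBC_JAR $@

Is that what you have in mind?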

Thanks for your help and support!


Shane Johnson | 801.360.3350
LinkedIn <https://www.linkedin.com/in/shanewjohnson> | Facebook
<https://www.facebook.com/shane.johnson.71653>

On Mon, May 22, 2017 at 4:06 AM, Chan Lee <chanlee514@gmail.com> wrote:

> Hi Shane,
>
> You'd want to do `$SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR,$JDBC_JAR $@`,
>
> where $JDBC_JAR is in $PIO_HOME/lib/spark/
>
> Best,
> Chan
>
>
>
> On Sat, May 20, 2017 at 4:33 PM, Shane Johnson <
> shanewaldenjohnson@gmail.com> wrote:
>
>> Thanks Donald,
>>
>> I've tried a couple of approaches. Each time I have exited and restarted
>> `pio-shell`. Am I going down the right path by adding the PSQL_JAR after
>> the ASSEMBLY_JAR, or is the fix more involved?
>>
>> Thank you for your help and support!
>>
>> Unsuccessful attempts:
>>
>>
>> then
>>   echo "Starting the PIO shell with the Apache Spark Shell."
>>   # compute the $ASSEMBLY_JAR, the location of the assembly jar, with
>>   # bin/compute-classpath.sh
>>   . ${PIO_HOME}/bin/compute-classpath.sh
>>   shift
>>   $SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR $@ $POSTGRES_JDBC_DRIVER
>>
>> then
>>   echo "Starting the PIO shell with the Apache Spark Shell."
>>   # compute the $ASSEMBLY_JAR, the location of the assembly jar, with
>>   # bin/compute-classpath.sh
>>   . ${PIO_HOME}/bin/compute-classpath.sh
>>   shift
>>   $SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR $@ $PSQL_JAR
>>
>>
>> Shane Johnson | 801.360.3350
>> LinkedIn <https://www.linkedin.com/in/shanewjohnson> | Facebook
>> <https://www.facebook.com/shane.johnson.71653>
>>
>> On Sat, May 20, 2017 at 2:46 PM, Donald Szeto <donald@apache.org> wrote:
>>
>>> Hey Shane,
>>>
>>> One quick workaround is to manually edit this line for now:
>>>
>>> https://github.com/apache/incubator-predictionio/blob/develop/bin/pio-shell#L62
>>>
>>> and add the JDBC assembly JAR after the main assembly JAR.
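>>>
>>> Concretely, I believe that line would end up looking something like this
>>> (a sketch from memory, using the 0.11.0 jar name; adjust to your install):
>>>
>>>   $SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR,${PIO_HOME}/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar $@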
>>>
>>> Sorry for the brief reply as I'm traveling. I will follow up with more
>>> details when I find a chance.
>>>
>>> Regards,
>>> Donald
>>>
>>> On Sat, May 20, 2017 at 12:16 PM Shane Johnson <
>>> shanewaldenjohnson@gmail.com> wrote:
>>>
>>>> Thank you for the quick reply, Mars, and for creating the issue. I really
>>>> appreciate the support. Will this issue also affect MySQL and HBase in
>>>> `pio-shell`? I'm trying to wrap my mind around $set and $unset events
>>>> and to run queries like the ones below. Can you think of other ways to
>>>> manually test the query, either using MySQL or HBase with `pio-shell`, or
>>>> using something other than `pio-shell`? One idea I had is sketched after
>>>> the queries below.
>>>>
>>>>
>>>>    - PEventStore.aggregateProperties(appName=appName, entityType="user")(sc).collect()
>>>>    - PEventStore.aggregateProperties(appName=appName, entityType="user", untilTime=Some(new DateTime(2014, 9, 11, 0, 0)))(sc).collect()
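>>>>
>>>> Here is that sketch: skip `pio-shell` entirely and hand spark-shell both
>>>> assemblies directly. The jar paths below are guesses from my install, so
>>>> adjust as needed:
>>>>
>>>>   $SPARK_HOME/bin/spark-shell \
>>>>     --jars $PIO_HOME/lib/pio-assembly-0.11.0-incubating.jar,$PIO_HOME/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar
>>>>
>>>> and then run the query inside the shell:
>>>>
>>>>   import org.apache.predictionio.data.store.PEventStore
>>>>   // assumption: the app name registered for the tutorial
>>>>   val appName = "MyTestApp"
>>>>   PEventStore.aggregateProperties(appName = appName, entityType = "user")(sc).collect()
>>>>
>>>> Would that be a reasonable way to test?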
>>>>
>>>> Thanks
>>>>
>>>> On Sat, May 20, 2017 at 11:13 AM Mars Hall <mars@heroku.com> wrote:
>>>>
>>>>> Hi Shane,
>>>>>
>>>>> Unfortunately `pio-shell` currently has class loading/classpath issues.
>>>>>
>>>>> Thanks for reminding me that an issue needed to be created. Here it is:
>>>>>   https://issues.apache.org/jira/browse/PIO-72
>>>>>
>>>>> *Mars
>>>>>
>>>>> ( <> .. <> )
>>>>>
>>>>> > On May 20, 2017, at 09:43, Shane Johnson <
>>>>> shanewaldenjohnson@gmail.com> wrote:
>>>>> >
>>>>> > Team,
>>>>> >
>>>>> > I am trying to follow the event modeling "MyTestApp" tutorial and am
>>>>> > having issues querying the data from Postgres. Has anyone run into this
>>>>> > error? Postgres is working fine when I run the models, but I am having
>>>>> > issues connecting to it through the PIO shell.
>>>>> >
>>>>> > java.lang.ClassNotFoundException: jdbc.StorageClient
>>>>> >   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>> >   at java.lang.Class.forName0(Native Method)
>>>>> >   at java.lang.Class.forName(Class.java:264)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getClient(Storage.scala:228)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.org$apache$predictionio$data$storage$Storage$$updateS2CM(Storage.scala:254)
>>>>> >   at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:215)
>>>>> >   at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:215)
>>>>> >   at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189)
>>>>> >   at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.sourcesToClientMeta(Storage.scala:215)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:284)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:269)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getMetaDataApps(Storage.scala:387)
>>>>> >   at org.apache.predictionio.data.store.Common$.appsDb$lzycompute(Common.scala:27)
>>>>> >   at org.apache.predictionio.data.store.Common$.appsDb(Common.scala:27)
>>>>> >   at org.apache.predictionio.data.store.Common$.appNameToId(Common.scala:32)
>>>>> >   at org.apache.predictionio.data.store.PEventStore$.aggregateProperties(PEventStore.scala:108)
>>>>> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>>>>> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
>>>>> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
>>>>> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
>>>>> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
>>>>> >   at $line20.$read$$iwC$$iwC$$iwC.<init>(<console>:44)
>>>>> >   at $line20.$read$$iwC$$iwC.<init>(<console>:46)
>>>>> >   at $line20.$read$$iwC.<init>(<console>:48)
>>>>> >   at $line20.$read.<init>(<console>:50)
>>>>> >   at $line20.$read$.<init>(<console>:54)
>>>>> >   at $line20.$read$.<clinit>(<console>)
>>>>> >   at $line20.$eval$.<init>(<console>:7)
>>>>> >   at $line20.$eval$.<clinit>(<console>)
>>>>> >   at $line20.$eval.$print(<console>)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> >   at java.lang.reflect.Method.invoke(Method.java:498)
>>>>> >   at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>>>> >   at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>>>>> >   at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>>>> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>>>> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>>>> >   at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>>>> >   at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>>>> >   at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>>>> >   at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>>>>> >   at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>>>>> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>>>>> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>>>>> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>> >   at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>>> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>>> >   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>>> >   at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>>> >   at org.apache.spark.repl.Main.main(Main.scala)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> >   at java.lang.reflect.Method.invoke(Method.java:498)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>>>> >   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>> >
>>>>> > org.apache.predictionio.data.storage.StorageClientException: Data source PGSQL was not properly initialized.
>>>>> >   at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:285)
>>>>> >   at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:285)
>>>>> >   at scala.Option.getOrElse(Option.scala:120)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:284)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:269)
>>>>> >   at org.apache.predictionio.data.storage.Storage$.getMetaDataApps(Storage.scala:387)
>>>>> >   at org.apache.predictionio.data.store.Common$.appsDb$lzycompute(Common.scala:27)
>>>>> >   at org.apache.predictionio.data.store.Common$.appsDb(Common.scala:27)
>>>>> >   at org.apache.predictionio.data.store.Common$.appNameToId(Common.scala:32)
>>>>> >   at org.apache.predictionio.data.store.PEventStore$.aggregateProperties(PEventStore.scala:108)
>>>>> >   at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>>>>> >   at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
>>>>> >   at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
>>>>> >   at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
>>>>> >   at $iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
>>>>> >   at $iwC$$iwC$$iwC.<init>(<console>:44)
>>>>> >   at $iwC$$iwC.<init>(<console>:46)
>>>>> >   at $iwC.<init>(<console>:48)
>>>>> >   at <init>(<console>:50)
>>>>> >   at .<init>(<console>:54)
>>>>> >   at .<clinit>(<console>)
>>>>> >   at .<init>(<console>:7)
>>>>> >   at .<clinit>(<console>)
>>>>> >   at $print(<console>)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> >   at java.lang.reflect.Method.invoke(Method.java:498)
>>>>> >   at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>>>> >   at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>>>>> >   at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>>>> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>>>> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>>>> >   at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>>>> >   at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>>>> >   at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>>>> >   at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>>>>> >   at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>>>>> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>>>>> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>>>>> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>> >   at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>>> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>>> >   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>>> >   at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>>> >   at org.apache.spark.repl.Main.main(Main.scala)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> >   at java.lang.reflect.Method.invoke(Method.java:498)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>>>> >   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>>>> >   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>> >
>>>>> > Shane Johnson | 801.360.3350
>>>>> >
>>>>> > LinkedIn | Facebook
>>>>>
>>>>>
>>
>
