predictionio-user mailing list archives

From Shane Johnson <shanewaldenjohn...@gmail.com>
Subject Re: 32 ERROR Storage$: Error initializing storage client for source PGSQL
Date Sat, 20 May 2017 19:16:21 GMT
Thank you for the quick reply, Mars, and for creating the issue. I really
appreciate the support. Will this issue also affect MySQL and HBase in
`pio-shell`? I'm trying to wrap my mind around $set and $unset events
and run queries like the ones below. Can you think of other ways to
manually test these queries, either using MySQL or HBase with
`pio-shell`, or using something other than `pio-shell`?


   - PEventStore.aggregateProperties(appName=appName, entityType="user")(sc).collect()
   - PEventStore.aggregateProperties(appName=appName, entityType="user", untilTime=Some(new DateTime(2014, 9, 11, 0, 0)))(sc).collect()
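For intuition while `pio-shell` is unavailable, the fold that aggregateProperties performs over $set/$unset events can be sketched in plain Python. This is only an illustration of the semantics, not PredictionIO code; the event dictionary shape, the example data, and the exact untilTime boundary behavior are assumptions:

```python
from datetime import datetime

def aggregate_properties(events, until_time=None):
    """Fold $set/$unset events (oldest first) into one property map per entity."""
    props = {}
    for ev in sorted(events, key=lambda e: e["eventTime"]):
        if until_time is not None and ev["eventTime"] >= until_time:
            continue  # mirrors untilTime=Some(...): skip events at/after the cutoff
        entity = props.setdefault(ev["entityId"], {})
        if ev["event"] == "$set":
            entity.update(ev["properties"])   # $set merges new values over old ones
        elif ev["event"] == "$unset":
            for key in ev["properties"]:
                entity.pop(key, None)         # $unset removes the listed keys
    return props

# Hypothetical event history for one user entity:
events = [
    {"event": "$set", "entityId": "u1", "eventTime": datetime(2014, 9, 9),
     "properties": {"plan": "free", "country": "US"}},
    {"event": "$set", "entityId": "u1", "eventTime": datetime(2014, 9, 10),
     "properties": {"plan": "pro"}},
    {"event": "$unset", "entityId": "u1", "eventTime": datetime(2014, 9, 12),
     "properties": {"country": None}},
]

print(aggregate_properties(events))
# → {'u1': {'plan': 'pro'}}   (the later $unset removed 'country')
print(aggregate_properties(events, until_time=datetime(2014, 9, 11, 0, 0)))
# → {'u1': {'plan': 'pro', 'country': 'US'}}   ($unset falls after the cutoff)
```

The second call shows why the untilTime variant matters: it reconstructs the properties as of a point in time, before later events are applied.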

Thanks

On Sat, May 20, 2017 at 11:13 AM Mars Hall <mars@heroku.com> wrote:

> Hi Shane,
>
> Unfortunately `pio-shell` currently has class loading/classpath issues.
>
> Thanks for reminding me that an issue needed to be created. Here it is:
>   https://issues.apache.org/jira/browse/PIO-72
>
> *Mars
>
> ( <> .. <> )
>
> > On May 20, 2017, at 09:43, Shane Johnson <shanewaldenjohnson@gmail.com>
> wrote:
> >
> > Team,
> >
> > I am trying to follow the event modeling "MyTestApp" tutorial and am
> > having issues querying the data from Postgres. Has anyone run into this
> > error? Postgres works fine when I run the models, but I am having
> > trouble connecting to it through the PIO shell.
> >
> > <image.png>
> >
> > java.lang.ClassNotFoundException: jdbc.StorageClient
> >   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> >   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> >   at java.lang.Class.forName0(Native Method)
> >   at java.lang.Class.forName(Class.java:264)
> >   at org.apache.predictionio.data.storage.Storage$.getClient(Storage.scala:228)
> >   at org.apache.predictionio.data.storage.Storage$.org$apache$predictionio$data$storage$Storage$$updateS2CM(Storage.scala:254)
> >   at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:215)
> >   at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:215)
> >   at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189)
> >   at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
> >   at org.apache.predictionio.data.storage.Storage$.sourcesToClientMeta(Storage.scala:215)
> >   at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:284)
> >   at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:269)
> >   at org.apache.predictionio.data.storage.Storage$.getMetaDataApps(Storage.scala:387)
> >   at org.apache.predictionio.data.store.Common$.appsDb$lzycompute(Common.scala:27)
> >   at org.apache.predictionio.data.store.Common$.appsDb(Common.scala:27)
> >   at org.apache.predictionio.data.store.Common$.appNameToId(Common.scala:32)
> >   at org.apache.predictionio.data.store.PEventStore$.aggregateProperties(PEventStore.scala:108)
> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
> >   at $line20.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
> >   at $line20.$read$$iwC$$iwC$$iwC.<init>(<console>:44)
> >   at $line20.$read$$iwC$$iwC.<init>(<console>:46)
> >   at $line20.$read$$iwC.<init>(<console>:48)
> >   at $line20.$read.<init>(<console>:50)
> >   at $line20.$read$.<init>(<console>:54)
> >   at $line20.$read$.<clinit>(<console>)
> >   at $line20.$eval$.<init>(<console>:7)
> >   at $line20.$eval$.<clinit>(<console>)
> >   at $line20.$eval.$print(<console>)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> >   at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> >   at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> >   at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> >   at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> >   at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> >   at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
> >   at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> >   at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> >   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> >   at org.apache.spark.repl.Main$.main(Main.scala:31)
> >   at org.apache.spark.repl.Main.main(Main.scala)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> >   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> >   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> >   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> >   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >
> > org.apache.predictionio.data.storage.StorageClientException: Data source PGSQL was not properly initialized.
> >   at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:285)
> >   at org.apache.predictionio.data.storage.Storage$$anonfun$10.apply(Storage.scala:285)
> >   at scala.Option.getOrElse(Option.scala:120)
> >   at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:284)
> >   at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:269)
> >   at org.apache.predictionio.data.storage.Storage$.getMetaDataApps(Storage.scala:387)
> >   at org.apache.predictionio.data.store.Common$.appsDb$lzycompute(Common.scala:27)
> >   at org.apache.predictionio.data.store.Common$.appsDb(Common.scala:27)
> >   at org.apache.predictionio.data.store.Common$.appNameToId(Common.scala:32)
> >   at org.apache.predictionio.data.store.PEventStore$.aggregateProperties(PEventStore.scala:108)
> >   at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
> >   at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
> >   at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
> >   at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
> >   at $iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
> >   at $iwC$$iwC$$iwC.<init>(<console>:44)
> >   at $iwC$$iwC.<init>(<console>:46)
> >   at $iwC.<init>(<console>:48)
> >   at <init>(<console>:50)
> >   at .<init>(<console>:54)
> >   at .<clinit>(<console>)
> >   at .<init>(<console>:7)
> >   at .<clinit>(<console>)
> >   at $print(<console>)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> >   at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> >   at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> >   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> >   at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> >   at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> >   at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> >   at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
> >   at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> >   at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> >   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> >   at org.apache.spark.repl.Main$.main(Main.scala:31)
> >   at org.apache.spark.repl.Main.main(Main.scala)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> >   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> >   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> >   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> >   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >
> > Shane Johnson | 801.360.3350
> >
> > LinkedIn | Facebook
>
>
