predictionio-user mailing list archives

From Shane Johnson <shanewaldenjohn...@gmail.com>
Subject Re: 32 ERROR Storage$: Error initializing storage client for source PGSQL
Date Mon, 22 May 2017 22:51:04 GMT
Thank you, Mars. This worked and was extremely helpful for me. Thank you
very much!

Shanes-MBP:~ shanejohnson$ pio-shell --with-spark --jars $PIO_HOME/lib/pio-assembly-0.11.0-incubating.jar,$PIO_HOME/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar,$PIO_HOME/lib/postgresql-42.1.1.jar,$PIO_HOME/vendors/spark-1.6.3-bin-hadoop2.6/lib/spark-assembly-1.6.3-hadoop2.6.0.jar

Starting the PIO shell with the Apache Spark Shell.

Using Spark's repl log4j profile:
org/apache/spark/log4j-defaults-repl.properties

To adjust logging level use sc.setLogLevel("INFO")

Welcome to

      ____              __

     / __/__  ___ _____/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3

      /_/


Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java
1.8.0_121)

Type in expressions to have them evaluated.

Type :help for more information.

Spark context available as sc.

SQL context available as sqlContext.


scala> val appName="PTB"

appName: String = PTB


scala> import org.apache.predictionio.data.store.PEventStore

import org.apache.predictionio.data.store.PEventStore


scala> PEventStore.aggregateProperties(appName=appName, entityType="user")(sc).collect()

res0: Array[(String, org.apache.predictionio.data.storage.PropertyMap)] =
Array((2,PropertyMap(Map(), 2014-09-09T16:17:42.937-08:00,
2014-09-13T16:17:42.143-08:00)))


scala> import org.joda.time.DateTime

import org.joda.time.DateTime


scala> PEventStore.aggregateProperties(appName=appName, entityType="user", untilTime=Some(new DateTime(2014, 9, 11, 0, 0)))(sc).collect()

res1: Array[(String, org.apache.predictionio.data.storage.PropertyMap)] =
Array((2,PropertyMap(Map(b -> JInt(5), a -> JInt(3), c -> JInt(6)),
2014-09-09T16:17:42.937-08:00, 2014-09-10T13:12:04.937-08:00)))
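For anyone adapting the command above: the `--jars` value is a single comma-separated string, which is easy to mistype by hand. A minimal shell sketch follows; the jar paths and the `$PIO_HOME` fallback are illustrative assumptions matching the versions used in this thread, not a definitive layout.

```shell
# Sketch: build the comma-separated --jars value from an array so extra
# jars are easy to add or remove.
PIO_HOME="${PIO_HOME:-/opt/PredictionIO}"   # assumption: adjust to your install

JARS=(
  "$PIO_HOME/lib/pio-assembly-0.11.0-incubating.jar"
  "$PIO_HOME/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar"
  "$PIO_HOME/lib/postgresql-42.1.1.jar"
)

# Join the array elements with commas (Spark's --jars wants commas,
# not the colon separator used by Java classpaths).
JARS_ARG=$(IFS=','; echo "${JARS[*]}")
echo "$JARS_ARG"

# The shell would then be launched as:
#   pio-shell --with-spark --jars "$JARS_ARG"
```

A colon-separated value would be treated by Spark as one long (nonexistent) path, which is exactly the failure mode discussed below.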



*Shane Johnson | 801.360.3350*
LinkedIn <https://www.linkedin.com/in/shanewjohnson> | Facebook
<https://www.facebook.com/shane.johnson.71653>

On Mon, May 22, 2017 at 1:03 PM, Mars Hall <mars@heroku.com> wrote:

> Shane,
>
> I just got `pio-shell` to work with both Postgres & Elasticsearch storage
> using the following command:
>
>   pio-shell --with-spark --jars PredictionIO-dist/lib/pio-assembly-0.11.0-SNAPSHOT.jar,PredictionIO-dist/lib/postgresql_jdbc.jar,PredictionIO-dist/lib/spark/pio-data-elasticsearch-assembly-0.11.0-SNAPSHOT.jar,PredictionIO-dist/lib/spark/pio-data-jdbc-assembly-0.11.0-SNAPSHOT.jar
>
> These paths will certainly be different for you, unless you're using the
> Heroku buildpack's local dev workflow:
>   https://github.com/heroku/predictionio-buildpack/blob/master/DEV.md
>
> Note that `--jars` takes a comma-separated list, not a colon-separated classpath.
>
> *Mars
>
> ( <> .. <> )
>
> > On May 22, 2017, at 07:08, Shane Johnson <shanewaldenjohnson@gmail.com>
> wrote:
> >
> > Thanks Chan,
> >
> > I added `$SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR:$JDBC_JAR $@`
> but had no success. You mentioned the $JDBC_JAR needs to be in
> $PIO_HOME/lib/spark/. Here is my current structure. It looks like
> 'pio-data-jdbc-assembly-0.11.0-incubating.jar' is in $PIO_HOME/lib/spark/
> when the package is installed.
> >
> > <image.png>
> >
> > Can you expound on what you mean by `where $JDBC_JAR is in $PIO_HOME/lib/spark/`?
> >
> > Does this need to be packaged up similarly to what is happening here?
> >
> >
> >  echo "Starting the PIO shell with the Apache Spark Shell."
> >   # compute the $ASSEMBLY_JAR, the location of the assembly jar, with
> >   # bin/compute-classpath.sh
> >   . ${PIO_HOME}/bin/compute-classpath.sh
> >
> > Thanks for your help and support!
> >
> >
> > Shane Johnson | 801.360.3350
> >
> > LinkedIn | Facebook
> >
> > On Mon, May 22, 2017 at 4:06 AM, Chan Lee <chanlee514@gmail.com> wrote:
> > Hi Shane,
> >
> > You'd want to do `$SPARK_HOME/bin/spark-shell --jars
> $ASSEMBLY_JAR:$JDBC_JAR $@`,
> >
> > where $JDBC_JAR is in $PIO_HOME/lib/spark/
> >
> > Best,
> > Chan
> >
> >
> >
> > On Sat, May 20, 2017 at 4:33 PM, Shane Johnson <
> shanewaldenjohnson@gmail.com> wrote:
> > Thanks Donald,
> >
> > I've tried a couple of approaches, exiting and restarting `pio-shell`
> each time. Am I on the right path by adding the PSQL_JAR after the
> ASSEMBLY_JAR, or is a more extensive change needed?
> >
> > Thank you for your help and support!
> >
> > Unsuccessful attempts:
> >
> >
> > then
> >   echo "Starting the PIO shell with the Apache Spark Shell."
> >   # compute the $ASSEMBLY_JAR, the location of the assembly jar, with
> >   # bin/compute-classpath.sh
> >   . ${PIO_HOME}/bin/compute-classpath.sh
> >   shift
> >   $SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR $@
> $POSTGRES_JDBC_DRIVER
> >
> > then
> >   echo "Starting the PIO shell with the Apache Spark Shell."
> >   # compute the $ASSEMBLY_JAR, the location of the assembly jar, with
> >   # bin/compute-classpath.sh
> >   . ${PIO_HOME}/bin/compute-classpath.sh
> >   shift
> >   $SPARK_HOME/bin/spark-shell --jars $ASSEMBLY_JAR $@ $PSQL_JAR
> >
> >
> > Shane Johnson | 801.360.3350
> >
> > LinkedIn | Facebook
> >
> > On Sat, May 20, 2017 at 2:46 PM, Donald Szeto <donald@apache.org> wrote:
> > Hey Shane,
> >
> > One quick workaround is to manually edit this line for now:
> >
> > https://github.com/apache/incubator-predictionio/blob/develop/bin/pio-shell#L62
> >
> > and add the JDBC assembly JAR after the main assembly JAR.
> >
> > Sorry for the brief reply as I'm traveling. I will follow up with more
> details when I find a chance.
> >
> > Regards,
> > Donald
> >
> > On Sat, May 20, 2017 at 12:16 PM Shane Johnson <
> shanewaldenjohnson@gmail.com> wrote:
> > Thank you for the quick reply, Mars, and for creating the issue. I really
> appreciate the support. Will this issue also affect MySQL and HBase in
> `pio-shell`? I'm trying to wrap my mind around $set and $unset events and
> to run queries like these. Can you think of other ways to manually test the
> query, either using MySQL or HBase with `pio-shell`, or using something
> other than `pio-shell`?
> >
> >       • PEventStore.aggregateProperties(appName=appName, entityType="user")(sc).collect()
> >       • PEventStore.aggregateProperties(appName=appName, entityType="user", untilTime=Some(new DateTime(2014, 9, 11, 0, 0)))(sc).collect()
> > Thanks
> >
> > On Sat, May 20, 2017 at 11:13 AM Mars Hall <mars@heroku.com> wrote:
> > Hi Shane,
> >
> > Unfortunately `pio-shell` currently has class loading/classpath issues.
> >
> > Thanks for reminding me that an issue needed to be created. Here it is:
> >   https://issues.apache.org/jira/browse/PIO-72
> >
> > *Mars
> >
> > ( <> .. <> )
> >
> > > On May 20, 2017, at 09:43, Shane Johnson <shanewaldenjohnson@gmail.com>
> wrote:
> > >
> > > Team,
> > >
> > > I am trying to follow the event modeling "MyTestApp" tutorial and am
> having trouble querying the data from Postgres. Has anyone run into this
> error? Postgres works fine when I run the models, but I cannot connect to
> it through the PIO shell.
> > >
> > > <image.png>
> > >
> > > java.lang.ClassNotFoundException: jdbc.StorageClient
> > >
> > > at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> > >
> > > at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> > >
> > > at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> > >
> > > at java.lang.Class.forName0(Native Method)
> > >
> > > at java.lang.Class.forName(Class.java:264)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.getClient(
> Storage.scala:228)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.org$apache$
> predictionio$data$storage$Storage$$updateS2CM(Storage.scala:254)
> > >
> > > at org.apache.predictionio.data.storage.Storage$$anonfun$
> sourcesToClientMeta$1.apply(Storage.scala:215)
> > >
> > > at org.apache.predictionio.data.storage.Storage$$anonfun$
> sourcesToClientMeta$1.apply(Storage.scala:215)
> > >
> > > at scala.collection.mutable.MapLike$class.getOrElseUpdate(
> MapLike.scala:189)
> > >
> > > at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> sourcesToClientMeta(Storage.scala:215)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> getDataObject(Storage.scala:284)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> getDataObjectFromRepo(Storage.scala:269)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> getMetaDataApps(Storage.scala:387)
> > >
> > > at org.apache.predictionio.data.store.Common$.appsDb$
> lzycompute(Common.scala:27)
> > >
> > > at org.apache.predictionio.data.store.Common$.appsDb(Common.scala:27)
> > >
> > > at org.apache.predictionio.data.store.Common$.appNameToId(
> Common.scala:32)
> > >
> > > at org.apache.predictionio.data.store.PEventStore$.
> aggregateProperties(PEventStore.scala:108)
> > >
> > > at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>
> (<console>:31)
> > >
> > > at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<
> console>:36)
> > >
> > > at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
> > >
> > > at $line20.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
> > >
> > > at $line20.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
> > >
> > > at $line20.$read$$iwC$$iwC$$iwC.<init>(<console>:44)
> > >
> > > at $line20.$read$$iwC$$iwC.<init>(<console>:46)
> > >
> > > at $line20.$read$$iwC.<init>(<console>:48)
> > >
> > > at $line20.$read.<init>(<console>:50)
> > >
> > > at $line20.$read$.<init>(<console>:54)
> > >
> > > at $line20.$read$.<clinit>(<console>)
> > >
> > > at $line20.$eval$.<init>(<console>:7)
> > >
> > > at $line20.$eval$.<clinit>(<console>)
> > >
> > > at $line20.$eval.$print(<console>)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62)
> > >
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> > >
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > >
> > > at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(
> SparkIMain.scala:1065)
> > >
> > > at org.apache.spark.repl.SparkIMain$Request.loadAndRun(
> SparkIMain.scala:1346)
> > >
> > > at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(
> SparkIMain.scala:840)
> > >
> > > at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> > >
> > > at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> > >
> > > at org.apache.spark.repl.SparkILoop.reallyInterpret$1(
> SparkILoop.scala:857)
> > >
> > > at org.apache.spark.repl.SparkILoop.interpretStartingWith(
> SparkILoop.scala:902)
> > >
> > > at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> > >
> > > at org.apache.spark.repl.SparkILoop.processLine$1(
> SparkILoop.scala:657)
> > >
> > > at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
> > >
> > > at org.apache.spark.repl.SparkILoop.org$apache$spark$
> repl$SparkILoop$$loop(SparkILoop.scala:670)
> > >
> > > at org.apache.spark.repl.SparkILoop$$anonfun$org$
> apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
> > >
> > > at org.apache.spark.repl.SparkILoop$$anonfun$org$
> apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> > >
> > > at org.apache.spark.repl.SparkILoop$$anonfun$org$
> apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> > >
> > > at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(
> ScalaClassLoader.scala:135)
> > >
> > > at org.apache.spark.repl.SparkILoop.org$apache$spark$
> repl$SparkILoop$$process(SparkILoop.scala:945)
> > >
> > > at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> > >
> > > at org.apache.spark.repl.Main$.main(Main.scala:31)
> > >
> > > at org.apache.spark.repl.Main.main(Main.scala)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62)
> > >
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> > >
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$
> deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(
> SparkSubmit.scala:181)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> > >
> > > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> > >
> > > org.apache.predictionio.data.storage.StorageClientException: Data
> source PGSQL was not properly initialized.
> > >
> > > at org.apache.predictionio.data.storage.Storage$$anonfun$10.
> apply(Storage.scala:285)
> > >
> > > at org.apache.predictionio.data.storage.Storage$$anonfun$10.
> apply(Storage.scala:285)
> > >
> > > at scala.Option.getOrElse(Option.scala:120)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> getDataObject(Storage.scala:284)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> getDataObjectFromRepo(Storage.scala:269)
> > >
> > > at org.apache.predictionio.data.storage.Storage$.
> getMetaDataApps(Storage.scala:387)
> > >
> > > at org.apache.predictionio.data.store.Common$.appsDb$
> lzycompute(Common.scala:27)
> > >
> > > at org.apache.predictionio.data.store.Common$.appsDb(Common.scala:27)
> > >
> > > at org.apache.predictionio.data.store.Common$.appNameToId(
> Common.scala:32)
> > >
> > > at org.apache.predictionio.data.store.PEventStore$.
> aggregateProperties(PEventStore.scala:108)
> > >
> > > at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
> > >
> > > at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
> > >
> > > at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
> > >
> > > at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
> > >
> > > at $iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
> > >
> > > at $iwC$$iwC$$iwC.<init>(<console>:44)
> > >
> > > at $iwC$$iwC.<init>(<console>:46)
> > >
> > > at $iwC.<init>(<console>:48)
> > >
> > > at <init>(<console>:50)
> > >
> > > at .<init>(<console>:54)
> > >
> > > at .<clinit>(<console>)
> > >
> > > at .<init>(<console>:7)
> > >
> > > at .<clinit>(<console>)
> > >
> > > at $print(<console>)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62)
> > >
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> > >
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > >
> > > at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(
> SparkIMain.scala:1065)
> > >
> > > at org.apache.spark.repl.SparkIMain$Request.loadAndRun(
> SparkIMain.scala:1346)
> > >
> > > at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(
> SparkIMain.scala:840)
> > >
> > > at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> > >
> > > at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> > >
> > > at org.apache.spark.repl.SparkILoop.reallyInterpret$1(
> SparkILoop.scala:857)
> > >
> > > at org.apache.spark.repl.SparkILoop.interpretStartingWith(
> SparkILoop.scala:902)
> > >
> > > at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> > >
> > > at org.apache.spark.repl.SparkILoop.processLine$1(
> SparkILoop.scala:657)
> > >
> > > at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
> > >
> > > at org.apache.spark.repl.SparkILoop.org$apache$spark$
> repl$SparkILoop$$loop(SparkILoop.scala:670)
> > >
> > > at org.apache.spark.repl.SparkILoop$$anonfun$org$
> apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
> > >
> > > at org.apache.spark.repl.SparkILoop$$anonfun$org$
> apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> > >
> > > at org.apache.spark.repl.SparkILoop$$anonfun$org$
> apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> > >
> > > at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(
> ScalaClassLoader.scala:135)
> > >
> > > at org.apache.spark.repl.SparkILoop.org$apache$spark$
> repl$SparkILoop$$process(SparkILoop.scala:945)
> > >
> > > at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> > >
> > > at org.apache.spark.repl.Main$.main(Main.scala:31)
> > >
> > > at org.apache.spark.repl.Main.main(Main.scala)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(
> NativeMethodAccessorImpl.java:62)
> > >
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> > >
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$
> deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(
> SparkSubmit.scala:181)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> > >
> > > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> > >
> > > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> > >
> > > Shane Johnson | 801.360.3350
> > >
> > > LinkedIn | Facebook
> >
> >
> >
> >
>
>
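The `java.lang.ClassNotFoundException: jdbc.StorageClient` in the trace above typically means a required assembly jar never reached the driver's classpath, often because of a mistyped path in the `--jars` list, which Spark may not fail fast on. A pre-flight existence check can catch that early; the `check_jars` helper below is a hypothetical sketch, not part of PredictionIO or Spark.

```shell
# Hypothetical helper: verify every jar in a comma-separated --jars list
# exists on disk before launching pio-shell. A missing jar otherwise only
# surfaces later as a ClassNotFoundException inside the shell.
check_jars() {
  local jar IFS=','
  for jar in $1; do                  # IFS=',' splits the list on commas
    [ -f "$jar" ] || { echo "missing: $jar"; return 1; }
  done
  echo "all jars found"
}
```

Run it on the exact string you pass to `--jars`, e.g. `check_jars "$PIO_HOME/lib/pio-assembly-0.11.0-incubating.jar"`, before starting the shell.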
