predictionio-user mailing list archives

From Naoki Takezoe <take...@gmail.com>
Subject Re: PostgreSQL JDBC driver configuration?
Date Thu, 21 Sep 2017 14:40:16 GMT
You have to put the jar file of the PostgreSQL JDBC driver in PIO_HOME/lib by
hand.
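Something like the following should do it (a sketch only: the driver version, paths, and the temp dir used to stand in for a real install are examples, not your exact setup):

```shell
# Simulate a PredictionIO install with a temp dir so this is safe to
# run anywhere; in a real install PIO_HOME is your PredictionIO
# directory, e.g. .../PredictionIO-0.11.0-incubating.
PIO_HOME="$(mktemp -d)"
mkdir -p "$PIO_HOME/lib"

# Stand-in for the driver jar downloaded from jdbc.postgresql.org:
touch postgresql-42.1.4.jar

# The actual fix: copy the JDBC driver into PIO_HOME/lib by hand.
cp postgresql-42.1.4.jar "$PIO_HOME/lib/"

ls "$PIO_HOME/lib"
```

If I remember right, the 0.11.0 conf/pio-env.sh template also has a POSTGRES_JDBC_DRIVER variable; it is worth checking that it points at the jar you copied (e.g. POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-42.1.4.jar), then re-run `pio status`.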

2017/09/21 19:52 "Noelia Osés Fernández" <noses@vicomtech.org>:

> Hi all,
>
> I'm having trouble because I can't seem to get PIO to work with PostgreSQL.
> I've seen the following report:
>
>
> https://issues.apache.org/jira/browse/PIO-49
> "PIO-49: PostgreSQL JDBC driver is no longer bundled with the core
> assembly. If you are using PostgreSQL, you must download the JDBC driver
> and update your configuration to point to the correct JDBC driver file."
>
> However, it doesn't say how to update the configuration. How is this done?
>
> When I do 'pio status' I get the following message:
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/VICOMTECH/noses/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/VICOMTECH/noses/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> [INFO] [Management$] Inspecting PredictionIO...
> [INFO] [Management$] PredictionIO 0.11.0-incubating is installed at /home/VICOMTECH/noses/PredictionIO/apache-predictionio-0.11.0-incubating/PredictionIO-0.11.0-incubating
> [INFO] [Management$] Inspecting Apache Spark...
> [INFO] [Management$] Apache Spark is installed at /home/VICOMTECH/noses/spark-2.2.0-bin-hadoop2.7
> [INFO] [Management$] Apache Spark 2.2.0 detected (meets minimum requirement of 1.3.0)
> [INFO] [Management$] Inspecting storage backend connections...
> [INFO] [Storage$] Verifying Meta Data Backend (Source: PGSQL)...
> [ERROR] [Management$] Unable to connect to all storage backends successfully.
> The following shows the error message from the storage backend.
> No suitable driver found for jdbc:postgresql://localhost/pio (java.sql.SQLException)
> Dumping configuration of initialized storage backend sources. Please make sure they are correct.
> Source Name: PGSQL; Type: jdbc; Configuration: URL -> jdbc:postgresql://localhost/pio, PASSWORD -> pio, TYPE -> jdbc, USERNAME -> pio
>
> Any help pointing towards the correct configuration of the JDBC driver for
> PostgreSQL to work with PIO will be much appreciated! (I have already
> downloaded postgresql-42.1.4.jar)
>
> Thank you very much!
> Noelia
>
