predictionio-user mailing list archives

From Donald Szeto <>
Subject Re: Issue with loading dependencies and jars
Date Thu, 15 Mar 2018 22:16:01 GMT
Hi Shane,

Although not highly recommended, would you mind setting
MYSQL_JDBC_DRIVER in your conf/ to point to the aws-java-sdk.jar
and running again? In the code, third-party JARs for MySQL and PostgreSQL
are always placed at the very front of the "spark-submit --jars" list.
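As a concrete illustration of that suggestion, here is a minimal sketch; the jar paths are hypothetical, and the exact file under conf/ depends on your setup:

```shell
#!/bin/sh
# Hypothetical jar paths, for illustration only.
MYSQL_JDBC_DRIVER=/path/to/aws-java-sdk.jar
ENGINE_JARS=/path/to/engine-assembly.jar

# PredictionIO places the configured driver jar at the front of the
# --jars list it hands to spark-submit, so the resulting invocation
# would look roughly like this:
echo "spark-submit --jars ${MYSQL_JDBC_DRIVER},${ENGINE_JARS} ..."
```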

If that works for you, we should file a new feature request for this.


> On Sun, Mar 11, 2018 at 11:03 AM, Mars Hall <> wrote:

> On Sat, Mar 10, 2018 at 7:49 PM, Shane Johnson <> wrote:
>> Mars, I was reviewing the code you are referencing, the "jars for
>> Spark" function, this morning and trying to see how it ties in. This code
>> is outside the custom binary distribution, correct? I could not find it
>> in the distribution being used by the buildpack.
> The PredictionIO binary distribution contains only the "compiled" code,
> its assembly jars. You'll need to dig into PredictionIO's source code (on
> GitHub) to find that function.
>> Do you think the ordering of jars may need to happen in the Common.scala
>> file instead of the
> Yes, to effectively revise Common.scala, changes must be made to the
> source code (e.g. checked out from the GitHub develop branch) and then built
> into a new custom binary distribution via the `` script:
> #getting-started
> --
> Mars Hall
> 415-818-7039
> Customer Facing Architect
> Salesforce Platform / Heroku
> San Francisco, California
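
The rebuild flow Mars describes can be sketched as follows. The build script name below is an assumption (the original message elides it), so verify it against the checked-out source tree:

```shell
#!/bin/sh
# Hedged sketch of rebuilding a custom PredictionIO distribution from
# the develop branch. The script name is an assumption; confirm it in
# the repository before running.
build_pio_distribution() {
  git clone https://github.com/apache/predictionio.git &&
  cd predictionio &&
  git checkout develop &&
  ./make-distribution.sh   # assumed name of the distribution build script
}
# Invoke manually: build_pio_distribution
```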
