predictionio-user mailing list archives

From Donald Szeto <don...@apache.org>
Subject Re: Issue with loading dependencies and jars
Date Thu, 15 Mar 2018 22:16:01 GMT
Hi Shane,

Although it's not highly recommended, would you mind trying to set
MYSQL_JDBC_DRIVER in your conf/pio-env.sh to point to aws-java-sdk.jar and
running again? In the code, third-party JARs for MySQL and PostgreSQL are
always placed at the very front of the "spark-submit --jars" argument.
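As a sketch of the workaround described above (the jar path below is a
hypothetical install location; adjust it to wherever your aws-java-sdk.jar
actually lives):

```shell
# conf/pio-env.sh
# Point the MySQL JDBC driver setting at the AWS SDK jar so it is
# placed at the front of the "spark-submit --jars" list.
MYSQL_JDBC_DRIVER=/opt/pio/lib/aws-java-sdk.jar
```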

If that works for you, we should file a new feature request for this.

Regards,
Donald

On Sun, Mar 11, 2018 at 11:03 AM, Mars Hall <mars.hall@salesforce.com>
wrote:

> On Sat, Mar 10, 2018 at 7:49 PM, Shane Johnson <shane@liftiq.com> wrote:
>
>> Mars, I was reviewing the code that you are referencing, the "jars for
>> Spark" function, this morning and trying to see how it ties in. This code
>> is outside the custom binary distribution, correct? I could not find it
>> in the distribution that is being used by the buildpack.
>>
>
> The PredictionIO binary distribution contains only the "compiled" code,
> i.e. its assembly JARs. You'll need to dig into PredictionIO's source code
> (on GitHub) to find that function.
>
>
>
>> Do you think the ordering of jars may need to happen in the Common.scala
>> file instead of the compute-classpath.sh?
>>
>
> Yes, to effectively revise Common.scala, changes must be made to the
> source code (e.g. checked out from the GitHub develop branch) and then
> built into a new custom binary distribution via the
> `make-distribution.sh` script:
>
>   http://predictionio.apache.org/community/contribute-code/#getting-started
>
>
> --
> Mars Hall
> 415-818-7039
> Customer Facing Architect
> Salesforce Platform / Heroku
> San Francisco, California
>
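The source-build steps Mars describes above can be sketched roughly as
follows (the branch name and output artifact name are assumptions based on
the contribute-code guide; check the guide for the authoritative steps):

```shell
# Fetch the PredictionIO source and build a custom binary distribution.
git clone https://github.com/apache/predictionio.git
cd predictionio
git checkout develop          # branch with the latest source changes
./make-distribution.sh        # builds the distributable tarball
```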
