spark-dev mailing list archives

From Georg Heiler <georg.kf.hei...@gmail.com>
Subject Re: spark messing up handling of native dependency code?
Date Sat, 03 Jun 2017 06:27:07 GMT
When tested without any parallelism, the same problem persists. In fact,
NiFi shows the same issue, so it is probably not related to Spark.
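One thing worth ruling out (this is an assumption on my part, not something the minimal example confirms): JAI registers its operation descriptors through META-INF resources such as `registryFile.jai` and `javax.media.jai.*` service files. If the job is submitted as a fat jar, the default deduplicate/discard merge behavior of sbt-assembly can silently drop or truncate those files, and a missing descriptor would surface at `ParameterBlockJAI` construction as exactly this kind of "input argument(s) may not be null" error. A sketch of a merge strategy that concatenates them instead, following the fallback pattern from the sbt-assembly README:

```scala
// build.sbt -- sketch assuming sbt-assembly builds the submitted fat jar
assemblyMergeStrategy in assembly := {
  // JAI's operation registry descriptors: concatenate rather than
  // deduplicate, so entries from all jars on the classpath survive
  case PathList("META-INF", "registryFile.jai")    => MergeStrategy.concat
  case PathList("META-INF", "registryFile.jaiext") => MergeStrategy.concat
  // JAI/GeoTools service provider files under META-INF/services
  case PathList("META-INF", "services", _ @ _*)    => MergeStrategy.concat
  // everything else: defer to the default strategy
  case other =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(other)
}
```

After rebuilding, `jar tf <your-assembly>.jar | grep -i registry` should show whether the file actually made it into the jar (the jar name here is a placeholder). If it is present and intact, the fat-jar theory is wrong and the problem lies elsewhere.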

Maciej Szymkiewicz <mszymkiewicz@gmail.com> schrieb am Sa., 3. Juni 2017 um
01:37 Uhr:

> Maybe not related, but in general GeoTools is not thread-safe, so using
> it from workers is most likely a gamble.
> On 06/03/2017 01:26 AM, Georg Heiler wrote:
>
> Hi,
>
> There is a weird problem with Spark when handling native dependency code:
> I want to use a library (JAI) with Spark to parse some spatial raster
> files. Unfortunately, there are some strange issues: JAI only works when
> run via the build tool, i.e. `sbt run`; it fails when executed on Spark.
>
> When executed via spark-submit the error is:
>
>     java.lang.IllegalArgumentException: The input argument(s) may not be
> null.
>     at
> javax.media.jai.ParameterBlockJAI.getDefaultMode(ParameterBlockJAI.java:136)
>     at
> javax.media.jai.ParameterBlockJAI.<init>(ParameterBlockJAI.java:157)
>     at
> javax.media.jai.ParameterBlockJAI.<init>(ParameterBlockJAI.java:178)
>     at
> org.geotools.process.raster.PolygonExtractionProcess.execute(PolygonExtractionProcess.java:171)
>
> which looks as if some native dependency (I think GEOS runs in the
> background) is not set up correctly.
>
> Assuming something was wrong with the classpath, I tried to run a plain
> Java/Scala function, but that one works just fine.
>
> Is Spark messing with the classpaths?
>
> I created a minimal example here:
> https://github.com/geoHeil/jai-packaging-problem
>
>
> Hope someone can shed some light on this problem,
> Regards,
> Georg
>
>
>
