spark-user mailing list archives

From: Deenar Toraskar <>
Subject: Re: Running in cluster mode causes native library linking to fail
Date: Wed, 14 Oct 2015 05:50:01 GMT
Hi Bernardo,

Is the native library installed on all machines of your cluster, and are you
setting both spark.driver.extraLibraryPath and
spark.executor.extraLibraryPath?
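
For example, a spark-submit invocation that sets both (the master, paths, class
name and jar below are placeholders, adjust them for your deployment):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.driver.extraLibraryPath=/opt/native/lib \
      --conf spark.executor.extraLibraryPath=/opt/native/lib \
      --class com.example.MyApp \
      my-app.jar

Note that these properties only extend the library search path; the .so itself
still has to be present at that location on every node.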


On 14 October 2015 at 05:44, Bernardo Vecchia Stein <> wrote:

> Hello,
>
> I am trying to run some Scala code in cluster mode using spark-submit.
> This code uses addLibrary to link with a .so that exists on the machine,
> and this library has a function that is called natively (there's a native
> definition as needed in the code).
>
> The problem I'm facing is: whenever I try to run this code in cluster
> mode, Spark fails with the following message when trying to execute the
> native function:
>
> java.lang.UnsatisfiedLinkError:
>
> Apparently, the library is being found by Spark, but the required function
> isn't.
>
> When trying to run in client mode, however, this doesn't fail and
> everything works as expected.
>
> Does anybody have any idea what might be the problem here? Is there any
> bug that could be related to this when running in cluster mode?
>
> I appreciate any help.
>
> Thanks,
> Bernardo
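
For readers hitting the same error, here is a minimal sketch of the kind of
setup being described (the library name, app name and the placeholder
computation are all hypothetical). The point is that the .so has to be loadable
inside every JVM that actually calls the native function, i.e. on the node
where the driver runs and in each executor:

    import org.apache.spark.{SparkConf, SparkContext}

    object NativeDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("native-demo"))

        val n = sc.parallelize(1 to 100, 4).mapPartitions { iter =>
          // This runs inside the executor JVM: the .so must exist on every
          // worker node at a path covered by spark.executor.extraLibraryPath
          // (or LD_LIBRARY_PATH there), otherwise this throws
          // UnsatisfiedLinkError.
          System.loadLibrary("mynative")   // hypothetical name -> libmynative.so
          iter.map(identity)               // the call into the native function would go here
        }.count()

        println(n)
        sc.stop()
      }
    }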
