spark-user mailing list archives

From Bernardo Vecchia Stein <bernardovst...@gmail.com>
Subject Running in cluster mode causes native library linking to fail
Date Wed, 14 Oct 2015 04:44:51 GMT
Hello,

I am trying to run some Scala code in cluster mode using spark-submit. The
code calls addLibrary to load a .so that exists on the machine, and the
library exposes a function that is called natively (the corresponding
native declaration is present in the code).

The problem I'm facing is: whenever I run this code in cluster mode, Spark
fails with the following message when it tries to execute the native
function:
java.lang.UnsatisfiedLinkError:
org.name.othername.ClassName.nativeMethod([B[B)[B

Apparently Spark finds the library, but not the required function inside it.

When running in client mode, however, nothing fails and everything works as
expected.
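For context, this is the same error the JVM raises whenever a native method is invoked in a JVM that never successfully linked the library. A minimal standalone reproduction outside Spark (class and method names below are placeholders, not my real code):

```java
// Demonstrates the JVM behavior behind the error: calling a native
// method without its library linked throws UnsatisfiedLinkError.
public class NativeDemo {
    // Placeholder native method with the same shape as in my code:
    // takes two byte[] arguments, returns byte[] -> signature ([B[B)[B
    private static native byte[] nativeMethod(byte[] a, byte[] b);

    public static void main(String[] args) {
        try {
            // No System.loadLibrary(...) was called in this JVM,
            // so the method is not linked.
            nativeMethod(new byte[0], new byte[0]);
            System.out.println("no error");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("caught UnsatisfiedLinkError");
        }
    }
}
```

So my suspicion is that in cluster mode some JVM (driver or executor) ends up in this state, even though the .so file itself is present on the machine.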

Does anybody have any idea of what might be the problem here? Is there any
bug that could be related to this when running in cluster mode?
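In case it matters: in cluster mode the driver itself runs on a worker node, so the library has to be present and loadable in every JVM, not just on the submitting machine. For reference, a sketch of how I could ship it with the job (file names, paths, and the main class below are placeholders, not my actual setup):

```shell
# Hypothetical submit command; paths and class names are placeholders.
# --files ships the .so into each container's working directory;
# extraLibraryPath makes that directory visible to the native linker
# for both executors and the (cluster-mode) driver.
spark-submit \
  --deploy-mode cluster \
  --class org.name.othername.Main \
  --files /local/path/libnative.so \
  --conf spark.executor.extraLibraryPath=. \
  --conf spark.driver.extraLibraryPath=. \
  app.jar
```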

I appreciate any help.
Thanks,
Bernardo
