spark-dev mailing list archives

From Pete Robbins <>
Subject Reliance on java.math.BigInteger implementation
Date Fri, 14 Aug 2015 18:27:33 GMT

The code to handle BigInteger types in


is dependent on the implementation of java.math.BigInteger:


      try {
        // Look up the raw memory offsets of BigInteger's private
        // "signum" and "mag" fields via reflection and sun.misc.Unsafe.
        signumOffset = _UNSAFE.objectFieldOffset(
            BigInteger.class.getDeclaredField("signum"));
        magOffset = _UNSAFE.objectFieldOffset(
            BigInteger.class.getDeclaredField("mag"));
      } catch (Exception ex) {
        // should not happen
      }
This relies on the class declaring the fields "int signum" and "int[] mag".

These implementation fields are not part of the Java specification for this
class, so they cannot be relied upon.

We are running Spark on IBM JDKs, and their spec-compliant implementation
has different internal fields. This causes an abort when running on those
Java runtimes. There is also no guarantee that future versions of
OpenJDK will keep these field names.

I think we need to find an implementation of these Spark functions that
relies only on the documented Java API rather than the internals of a
specific implementation.
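One possible direction (a sketch, not a drop-in replacement for the Spark code above; the class and method names here are illustrative): the sign and magnitude can be obtained through the public BigInteger contract alone, via signum() and abs().toByteArray(), which behave identically on every compliant JVM.

```java
import java.math.BigInteger;

public class SpecCompliantBigInt {

    // signum() is part of the public BigInteger contract: -1, 0, or 1.
    static int signumOf(BigInteger value) {
        return value.signum();
    }

    // abs().toByteArray() yields the big-endian magnitude bytes;
    // this is spec-defined behavior, not an implementation detail.
    static byte[] magnitudeOf(BigInteger value) {
        return value.abs().toByteArray();
    }

    public static void main(String[] args) {
        BigInteger v = new BigInteger("-1234567890123456789");
        int signum = signumOf(v);
        byte[] mag = magnitudeOf(v);

        // Round-trip: the (signum, magnitude) pair reconstructs the value,
        // using only the public BigInteger(int, byte[]) constructor.
        BigInteger back = new BigInteger(signum, mag);
        System.out.println(signum);          // -1
        System.out.println(back.equals(v));  // true
    }
}
```

This would likely be slower than the raw Unsafe field access, since toByteArray() copies, but it works on any JVM that implements the specification.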

Any thoughts?
