spark-commits mailing list archives

Subject spark git commit: [SPARK-7756] CORE RDDOperationScope fix for IBM Java
Date Wed, 10 Jun 2015 20:22:03 GMT
Repository: spark
Updated Branches:
  refs/heads/branch-1.4 28e8a6ea6 -> 568d1d51d

[SPARK-7756] CORE RDDOperationScope fix for IBM Java

IBM Java includes an extra frame when we call getStackTrace(): "getStackTraceImpl", a native
method. This causes two tests in "DStreamScopeSuite" to fail when running with IBM Java:
instead of "map" or "filter", "getStackTrace" is returned as the method name found. This
commit fixes the issue by using dropWhile. Given that our current method is withScope,
we look for the next method that isn't ours; we don't care about methods that come before
us in the stack trace, e.g. getStackTrace, regardless of how many such frames there are.

java.lang.Thread.getStackTraceImpl(Native Method)


I've tested this with Oracle and IBM Java; no side effects on other tests were introduced.
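The dropWhile-then-find approach can be sketched in isolation like this (a minimal standalone demo, not Spark's actual code; the object and helper names here are hypothetical):

```scala
// Demonstrates the stack-crawl fix: skip any frames that appear before our
// own method (e.g. getStackTrace, or IBM Java's extra getStackTraceImpl),
// then take the first frame whose method name differs from ours.
object StackCrawlDemo {
  def callerOf(ourMethodName: String, frames: Seq[StackTraceElement]): String = {
    frames
      .dropWhile(_.getMethodName != ourMethodName) // drop everything above us
      .find(_.getMethodName != ourMethodName)      // first frame that isn't us
      .map(_.getMethodName)
      .getOrElse("unknown")
  }

  def main(args: Array[String]): Unit = {
    def frame(m: String) = new StackTraceElement("Demo", m, "Demo.scala", 1)
    // IBM Java-style trace: extra native frame before withScope
    val trace = Seq(frame("getStackTraceImpl"), frame("getStackTrace"),
                    frame("withScope"), frame("withScope"), frame("map"))
    println(callerOf("withScope", trace)) // prints "map"
  }
}
```

Because dropWhile first discards every leading frame that isn't withScope, the extra native frame no longer matters: the search for the caller only begins once we have reached our own method.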

Author: Adam Roberts <>
Author: a-roberts <>

Closes #6740 from a-roberts/RDDScopeStackCrawlFix and squashes the following commits:

13ce390 [Adam Roberts] Ensure consistency with String equality checking
a4fc0e0 [a-roberts] Update RDDOperationScope.scala

(cherry picked from commit 19e30b48f3c6d0b72871d3e15b9564c1b2822700)
Signed-off-by: Andrew Or <>


Branch: refs/heads/branch-1.4
Commit: 568d1d51d695bea4389f4470cd98707f3049885a
Parents: 28e8a6e
Author: Adam Roberts <>
Authored: Wed Jun 10 13:21:01 2015 -0700
Committer: Andrew Or <>
Committed: Wed Jun 10 13:21:59 2015 -0700

 .../main/scala/org/apache/spark/rdd/RDDOperationScope.scala   | 7 +++----
 1 file changed, 3 insertions(+), 4 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala b/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala
index 6b09dfa..4466728 100644
--- a/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala
+++ b/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala
@@ -95,10 +95,9 @@ private[spark] object RDDOperationScope extends Logging {
   private[spark] def withScope[T](
       sc: SparkContext,
       allowNesting: Boolean = false)(body: => T): T = {
-    val stackTrace = Thread.currentThread.getStackTrace().tail // ignore "Thread#getStackTrace"
-    val ourMethodName = stackTrace(1).getMethodName // i.e. withScope
-    // Climb upwards to find the first method that's called something different
-    val callerMethodName = stackTrace
+    val ourMethodName = "withScope"
+    val callerMethodName = Thread.currentThread.getStackTrace()
+      .dropWhile(_.getMethodName != ourMethodName)
       .find(_.getMethodName != ourMethodName)
       .getOrElse {
