From: "Hyukjin Kwon (JIRA)"
To: issues@spark.apache.org
Date: Fri, 22 Jun 2018 04:17:00 +0000 (UTC)
Subject: [jira] [Commented] (SPARK-24201) IllegalArgumentException originating from ClosureCleaner in Java 9+

    [ https://issues.apache.org/jira/browse/SPARK-24201?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16519997#comment-16519997 ]

Hyukjin Kwon commented on SPARK-24201:
--------------------------------------

JDK 9 support is also in progress in SPARK-24417. Let's leave this one closed until that work is complete, and reopen it if the problem still exists after that fix.
> IllegalArgumentException originating from ClosureCleaner in Java 9+
> --------------------------------------------------------------------
>
>                 Key: SPARK-24201
>                 URL: https://issues.apache.org/jira/browse/SPARK-24201
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>         Environment: java version "9.0.4"
>                      scala version "2.11.12"
>            Reporter: Grant Henke
>            Priority: Major
>
> Apache Kudu's kudu-spark tests are failing on Java 9.
> I assume Java 9 is supported and this is an unexpected bug, given the docs say "Spark runs on Java 8+" [here|https://spark.apache.org/docs/2.3.0/].
> The stack trace seen is below:
> {code}
> java.lang.IllegalArgumentException
>         at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>         at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
>         at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
>         at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
>         at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
>         at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
>         at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
>         at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
>         at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
>         at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
>         at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
>         at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
>         at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
>         at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
>         at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
>         at scala.collection.immutable.List.foreach(List.scala:392)
>         at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
>         at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
>         at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2092)
>         at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>         at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
>         at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
>         at org.apache.kudu.spark.kudu.KuduRDDTest$$anonfun$1.apply(KuduRDDTest.scala:30)
>         at org.apache.kudu.spark.kudu.KuduRDDTest$$anonfun$1.apply(KuduRDDTest.scala:27)
>         at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
>         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>         at org.scalatest.Transformer.apply(Transformer.scala:22)
>         at org.scalatest.Transformer.apply(Transformer.scala:20)
>         at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
>         at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
>         at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
>         at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
>         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
>         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
>         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
>         at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.org$scalatest$BeforeAndAfter$$super$runTest(KuduRDDTest.scala:25)
>         at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:203)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.runTest(KuduRDDTest.scala:25)
>         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
>         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
>         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
>         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
>         at scala.collection.immutable.List.foreach(List.scala:392)
>         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
>         at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
>         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
>         at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
>         at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
>         at org.scalatest.Suite$class.run(Suite.scala:1147)
>         at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
>         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
>         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
>         at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
>         at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.org$scalatest$BeforeAndAfterAll$$super$run(KuduRDDTest.scala:25)
>         at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
>         at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.org$scalatest$BeforeAndAfter$$super$run(KuduRDDTest.scala:25)
>         at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:258)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.run(KuduRDDTest.scala:25)
> {code}
> It looks like ClassReader's constructor throws an IllegalArgumentException if the class file's major version is greater than Java 1.8's (Opcodes.V1_8, i.e. 52):
> {code}
> public ClassReader(final byte[] b, final int off, final int len) {
>     this.b = b;
>     // checks the class version
>     if (readShort(off + 6) > Opcodes.V1_8) {
>         throw new IllegalArgumentException();
>     }
>     ...
> {code}
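> For illustration, here is a minimal, self-contained sketch of that check (the class name Asm5Repro is made up for this example, and it assumes the shaded xbean-asm5 jar is on the classpath). Compiled and run on Java 9 it prints a major version of 53 and then fails with the same bare IllegalArgumentException as the stack trace above:
> {code}
> import java.io.InputStream;
>
> public class Asm5Repro {
>     public static void main(String[] args) throws Exception {
>         byte[] bytes;
>         // Read this class's own bytecode back off the classpath.
>         try (InputStream in = Asm5Repro.class.getResourceAsStream("Asm5Repro.class")) {
>             bytes = in.readAllBytes(); // InputStream.readAllBytes() is Java 9+
>         }
>         // Bytes 6-7 of a class file hold the big-endian major version:
>         // 52 = Java 8 (Opcodes.V1_8), 53 = Java 9, 54 = Java 10.
>         int major = ((bytes[6] & 0xFF) << 8) | (bytes[7] & 0xFF);
>         System.out.println("class file major version: " + major);
>         // asm5 rejects anything above 52 with a bare IllegalArgumentException,
>         // which is why the stack trace above carries no message.
>         new org.apache.xbean.asm5.ClassReader(bytes);
>     }
> }
> {code}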
> It looks like upgrading to org.apache.xbean.asm6 would solve the issue by supporting class files up to Java 10:
> {code}
> if (checkClassVersion && readShort(classFileOffset + 6) > Opcodes.V10) {
>     throw new IllegalArgumentException(
>         "Unsupported class file major version " + readShort(classFileOffset + 6));
> }
> {code}
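> As a counterpart to the sketch above, the same class bytes go through the asm6 reader without error (again a hypothetical example: the class name Asm6Check is made up, and the shaded xbean-asm6 jar is assumed to be on the classpath):
> {code}
> import java.io.InputStream;
>
> public class Asm6Check {
>     public static void main(String[] args) throws Exception {
>         byte[] bytes;
>         try (InputStream in = Asm6Check.class.getResourceAsStream("Asm6Check.class")) {
>             bytes = in.readAllBytes();
>         }
>         // asm6 accepts class files up to major version 54 (Java 10), so a
>         // Java 9-compiled class (major version 53) parses without throwing.
>         org.apache.xbean.asm6.ClassReader reader =
>             new org.apache.xbean.asm6.ClassReader(bytes);
>         System.out.println("parsed OK: " + reader.getClassName());
>     }
> }
> {code}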
> The Apache Kudu test failures can be recreated by cloning the repo and running the kudu-spark tests:
> {code}
> git clone https://github.com/apache/kudu.git
> cd kudu/java/kudu-spark
> ./gradlew test
> {code}
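> Until an upgrade along those lines lands in Spark, a possible workaround on our side (my assumption, not something from the Spark docs) is to run these tests on a Java 8 JDK, or to compile the test classes with javac --release 8 so their class-file major version stays at 52 and passes the asm5 check.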