Date: Wed, 7 Jun 2017 21:27:18 +0000 (UTC)
From: "ASF subversion and git services (JIRA)"
To: issues@geode.apache.org
Reply-To: dev@geode.apache.org
Subject: [jira] [Commented] (GEODE-194) Geode Spark Connector does not support Spark 2.0
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8

    [ https://issues.apache.org/jira/browse/GEODE-194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16041648#comment-16041648 ]

ASF subversion and git services commented on GEODE-194:
-------------------------------------------------------

Commit b27a79ae91943a6ed1426f44dc4709a33eb671eb in geode's branch refs/heads/develop from [~amb]
[ https://git-wip-us.apache.org/repos/asf?p=geode.git;h=b27a79a ]

GEODE-194: Remove spark connector

Remove the spark connector code until it can be updated for the current
Spark release. We should also integrate the build lifecycle and consider
how to extract this into a separate repo.

This closes #558

> Geode Spark Connector does not support Spark 2.0
> ------------------------------------------------
>
>       Key: GEODE-194
>       URL: https://issues.apache.org/jira/browse/GEODE-194
>   Project: Geode
> Issue Type: Bug
> Components: extensions
>  Reporter: Jianxia Chen
>    Labels: experimental, gsoc2016
>
> The BasicIntegrationTest fails when using Spark 1.4, e.g.
> [info] - GemFire OQL query with more complex UDT: Partitioned Region *** FAILED ***
> [info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 24.0 failed 1 times, most recent failure: Lost task 0.0 in stage 24.0 (TID 48, localhost): scala.MatchError:
> [info] Portfolio [id=3 status=active type=type3
> [info]  AOL:Position [secId=AOL qty=978.0 mktValue=40.373],
> [info]  MSFT:Position [secId=MSFT qty=98327.0 mktValue=23.32]] (of class ittest.io.pivotal.gemfire.spark.connector.Portfolio)
> [info]   at org.apache.spark.sql.catalyst.CatalystTypeConverters$$anonfun$createToCatalystConverter$4.apply(CatalystTypeConverters.scala:178)
> [info]   at org.apache.spark.sql.execution.RDDConversions$$anonfun$rowToRowRdd$1$$anonfun$apply$2.apply(ExistingRDD.scala:62)
> [info]   at org.apache.spark.sql.execution.RDDConversions$$anonfun$rowToRowRdd$1$$anonfun$apply$2.apply(ExistingRDD.scala:59)
> [info]   at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> [info]   at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> [info]   at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> [info]   at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> [info]   at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
> [info]   at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
> [info]   at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
> [info]   at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
> [info]   at scala.collection.AbstractIterator.to(Iterator.scala:1157)
> [info]   at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
> [info]   at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
> [info]   at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
> [info]   at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
> [info]   at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:885)
> [info]   at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:885)
> [info]   at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1765)
> [info]   at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1765)
> [info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
> [info]   at org.apache.spark.scheduler.Task.run(Task.scala:70)
> [info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> [info]   at java.lang.Thread.run(Thread.java:745)
> [info]
> [info] Driver stacktrace:
> [info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
> [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
> [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
> [info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> [info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> [info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
> [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> [info]   at scala.Option.foreach(Option.scala:236)
> [info]   at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
> [info] ...

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
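The scala.MatchError above comes from CatalystTypeConverters: it pattern-matches incoming values against the types Catalyst knows how to convert, and the plain Portfolio class matches none of its cases. The following is a minimal, hedged pure-Scala sketch (not the actual Spark or connector code; Portfolio's fields are simplified) of that failure mode: a converter whose match covers only recognized types throws MatchError on an arbitrary class that is not a case class / Product.

```scala
// Sketch only: mimics how a type converter that pattern-matches on
// known types fails with scala.MatchError on an unrecognized class.
// This is NOT Spark's CatalystTypeConverters, just an illustration.

// A plain class (not a case class), like the test's Portfolio UDT.
class Portfolio(val id: Int, val status: String)

// A toy "toCatalyst" converter covering only a few known shapes.
def toCatalyst(value: Any): Any = value match {
  case s: String  => s                          // primitive-like types pass through
  case i: Int     => i
  case p: Product => p.productIterator.toSeq    // case classes/tuples convert field-wise
  // no case for arbitrary classes -> scala.MatchError at runtime
}

val ok = toCatalyst(("a", 1))                   // Tuple2 is a Product, converts fine

val err =
  try { toCatalyst(new Portfolio(3, "active")); "no error" }
  catch { case _: MatchError => "scala.MatchError" }
```

Registering a proper UDT (or using a case class the converter recognizes) is the usual way around this, which is part of why the connector needed rework for newer Spark releases.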