phoenix-dev mailing list archives

From "Hudson (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-3540) Fix Time data type in Phoenix Spark integration
Date Thu, 22 Dec 2016 09:07:58 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-3540?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15769553#comment-15769553 ]

Hudson commented on PHOENIX-3540:
---------------------------------

SUCCESS: Integrated in Jenkins build Phoenix-master #1521 (See [https://builds.apache.org/job/Phoenix-master/1521/])
PHOENIX-3540 Fix Time data type in Phoenix Spark integration (ankitsinghal59: rev bd2acd5404ce03fb330a72bbf346546b7f4fbd2b)
* (edit) phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala
* (edit) phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRDD.scala
* (edit) phoenix-spark/src/it/resources/globalSetup.sql
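
The patch itself is not quoted in this message. As a rough sketch of the kind of change the PhoenixRDD.scala edit implies (not the committed code): Spark SQL has no TIME type, so a Phoenix TIME column has to surface as TimestampType, and each java.sql.Time value has to be widened to java.sql.Timestamp before Catalyst converts the row:

{code}
// Hedged sketch, not the committed patch: widen java.sql.Time to
// java.sql.Timestamp so Catalyst's TimestampConverter can handle it.
import java.sql.{Time, Timestamp}

def toCatalystCompatible(value: Any): Any = value match {
  case t: Time => new Timestamp(t.getTime) // TIME has no Spark SQL counterpart
  case other   => other                    // everything else passes through
}
{code}

Mapping each row value through a function of this shape before handing rows to createDataFrame avoids the ClassCastException shown in the trace below.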


> Fix Time data type in Phoenix Spark integration
> -----------------------------------------------
>
>                 Key: PHOENIX-3540
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-3540
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Sergio Peleato
>            Assignee: Ankit Singhal
>             Fix For: 4.10.0
>
>         Attachments: PHOENIX-3540.patch
>
>
> {code}
> 2016-12-13 07:56:07,773|INFO|MainThread|machine.py:145 - run()|2016-12-13 07:56:07,773 DEBUG [main] repl.SparkILoop$SparkILoopInterpreter: Invoking: public static java.lang.String $line20.$eval.$print()
> 2016-12-13 07:56:07,805|INFO|MainThread|machine.py:145 - run()|org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ctr-e77-1481596162056-0246-01-000003.hwx.site): java.lang.ClassCastException: java.sql.Time cannot be cast to java.sql.Timestamp
> 2016-12-13 07:56:07,805|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.catalyst.CatalystTypeConverters$TimestampConverter$.toCatalystImpl(CatalystTypeConverters.scala:313)
> 2016-12-13 07:56:07,805|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.catalyst.CatalystTypeConverters$CatalystTypeConverter.toCatalyst(CatalystTypeConverters.scala:102)
> 2016-12-13 07:56:07,805|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:260)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.catalyst.CatalystTypeConverters$StructConverter.toCatalystImpl(CatalystTypeConverters.scala:250)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.catalyst.CatalystTypeConverters$CatalystTypeConverter.toCatalyst(CatalystTypeConverters.scala:102)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.catalyst.CatalystTypeConverters$$anonfun$createToCatalystConverter$2.apply(CatalystTypeConverters.scala:401)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.SQLContext$$anonfun$6.apply(SQLContext.scala:492)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.sql.SQLContext$$anonfun$6.apply(SQLContext.scala:492)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 2016-12-13 07:56:07,806|INFO|MainThread|machine.py:145 - run()|at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1112)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1111)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1111)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1277)
> 2016-12-13 07:56:07,807|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1119)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1091)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|at java.lang.Thread.run(Thread.java:745)
> 2016-12-13 07:56:07,808|INFO|MainThread|machine.py:145 - run()|
> {code}
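> For context, a failure of this shape can be reproduced by round-tripping a table with a TIME column through the phoenix-spark connector. The table name, column names, and ZooKeeper quorum below are illustrative assumptions, not taken from this report:
> {code}
> // Illustrative sketch only; TIME_TEST and localhost:2181 are assumptions.
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.SQLContext
> import org.apache.phoenix.spark._ // phoenixTableAsDataFrame / saveToPhoenix implicits
>
> object TimeColumnRepro {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext(new SparkConf().setAppName("phoenix-time-repro"))
>     val sqlContext = new SQLContext(sc)
>
>     // Assumes a table like: CREATE TABLE TIME_TEST (ID BIGINT NOT NULL PRIMARY KEY, T TIME)
>     val df = sqlContext.phoenixTableAsDataFrame(
>       "TIME_TEST", Seq("ID", "T"), zkUrl = Some("localhost:2181"))
>
>     // Reading is lazy; before the fix, the java.sql.Time values in the rows
>     // clashed with the TimestampType schema once the save forced evaluation.
>     df.saveToPhoenix("TIME_TEST", zkUrl = Some("localhost:2181"))
>   }
> }
> {code}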



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
