spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-10798) JsonMappingException with Spark Context Parallelize
Date Fri, 02 Oct 2015 21:34:27 GMT

    [ https://issues.apache.org/jira/browse/SPARK-10798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14941804#comment-14941804 ]

Sean Owen commented on SPARK-10798:
-----------------------------------

That's Java code, and you're pasting it into the Scala shell.
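To illustrate the point of the comment: angle-bracket generic syntax like `new Vector<Row>()` is valid in a compiled Java source file, but it is a syntax error when pasted into the Scala REPL, where type parameters use square brackets (e.g. `new java.util.Vector[Row]()` in Scala). A minimal stdlib-only sketch of the valid Java form, with `String` standing in for Spark's `Row` so it runs without Spark on the classpath:

```java
import java.util.List;
import java.util.Vector;

public class JavaSyntaxDemo {
    public static void main(String[] args) {
        // Angle-bracket generics: Java syntax, not Scala.
        // In the Scala shell this line would have to be written as
        //   val rows = new java.util.Vector[String]()
        List<String> rows = new Vector<String>();
        rows.add("test");
        System.out.println(rows.size()); // prints 1
    }
}
```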

> JsonMappingException with Spark Context Parallelize
> ---------------------------------------------------
>
>                 Key: SPARK-10798
>                 URL: https://issues.apache.org/jira/browse/SPARK-10798
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.0
>         Environment: Linux, Java 1.8.0_45
>            Reporter: Dev Lakhani
>
> When trying to create an RDD of Rows using a JavaSparkContext, if I serialize the rows with Kryo first, the SparkContext fails.
> List<Row> rows = new Vector<Row>(); // using a new set of data
> rows.add(RowFactory.create("test"));
> byte[] data = Kryo.serialize(rows);              // pseudocode for a Kryo round trip
> List<Row> fromKryoRows = Kryo.unserialize(data); // (Kryo's real API is kryo.writeObject/readObject)
> javaSparkContext.parallelize(rows);
> // or, using the deserialized rows:
> javaSparkContext.parallelize(fromKryoRows);
> I get:
> com.fasterxml.jackson.databind.JsonMappingException: (None,None) (of class scala.Tuple2) (through reference chain: org.apache.spark.rdd.RDDOperationScope["parent"])
>                at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:210)
>                at com.fasterxml.jackson.databind.JsonMappingException.wrapWithPath(JsonMappingException.java:177)
>                at com.fasterxml.jackson.databind.ser.std.StdSerializer.wrapAndThrow(StdSerializer.java:187)
>                at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:647)
>                at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:152)
>                at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:128)
>                at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:2881)
>                at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:2338)
>                at org.apache.spark.rdd.RDDOperationScope.toJson(RDDOperationScope.scala:50)
>                at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:141)
>                at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>                at org.apache.spark.SparkContext.withScope(SparkContext.scala:700)
>                at org.apache.spark.SparkContext.parallelize(SparkContext.scala:714)
>                at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:145)
>                at org.apache.spark.api.java.JavaSparkContext.parallelize(JavaSparkContext.scala:157)
>                ...
> Caused by: scala.MatchError: (None,None) (of class scala.Tuple2)
>                at com.fasterxml.jackson.module.scala.ser.OptionSerializer$$anonfun$serialize$1.apply$mcV$sp(OptionSerializerModule.scala:32)
>                at com.fasterxml.jackson.module.scala.ser.OptionSerializer$$anonfun$serialize$1.apply(OptionSerializerModule.scala:32)
>                at com.fasterxml.jackson.module.scala.ser.OptionSerializer$$anonfun$serialize$1.apply(OptionSerializerModule.scala:32)
>                at scala.Option.getOrElse(Option.scala:120)
>                at com.fasterxml.jackson.module.scala.ser.OptionSerializer.serialize(OptionSerializerModule.scala:31)
>                at com.fasterxml.jackson.module.scala.ser.OptionSerializer.serialize(OptionSerializerModule.scala:22)
>                at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:505)
>                at com.fasterxml.jackson.module.scala.ser.OptionPropertyWriter.serializeAsField(OptionSerializerModule.scala:128)
>                at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:639)
>                ... 19 more
> I've tried updating jackson-module-scala to 2.6.1, but the same issue occurs. This happens in local mode with Java 1.8.0_45. I searched the web and this JIRA for similar issues but found nothing of interest.
>  
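For reference, the serialize/deserialize round trip the reporter describes can be sketched without Spark or Kryo at all. The helper names `serialize`/`deserialize` below are illustrative, plain `java.io` serialization stands in for Kryo (whose actual API differs, e.g. `kryo.writeObject`/`kryo.readObject`), and `String` stands in for Spark's `Row`:

```java
import java.io.*;
import java.util.List;
import java.util.Vector;

public class RoundTrip {
    // Serialize any Serializable object to a byte array.
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    // Deserialize a byte array back into an object.
    @SuppressWarnings("unchecked")
    static <T> T deserialize(byte[] data) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return (T) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        List<String> rows = new Vector<>();
        rows.add("test");
        byte[] data = serialize(rows);            // analogous to Kryo.serialize(rows)
        List<String> fromBytes = deserialize(data); // analogous to Kryo.unserialize(data)
        System.out.println(fromBytes.get(0));     // prints "test"
    }
}
```

Note that the stack trace in the issue points at JSON serialization of `RDDOperationScope` inside `parallelize`, not at the Kryo round trip itself, which is consistent with the comment above that the failure comes from pasting Java syntax into the Scala shell rather than from the serialized data.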



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

