accumulo-user mailing list archives

From Josh Elser <josh.el...@gmail.com>
Subject Re: Serialization error
Date Tue, 28 Apr 2015 17:43:08 GMT
Hi Madhvi,

Thanks for posting this. I'm not super familiar with Spark, but my hunch 
is that it requires the objects it works with to implement Java's 
Serializable interface.
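
For context, here's a minimal sketch (in Scala; the configuration 
details are assumptions, since the original code isn't shown) of the 
pattern that typically triggers this:

    import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
    import org.apache.accumulo.core.data.{Key, Value}
    import org.apache.hadoop.conf.Configuration
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("accumulo-read"))

    // Instance/table settings for AccumuloInputFormat are omitted; they
    // would normally be set on this Configuration before reading.
    val hadoopConf = new Configuration()

    val rdd = sc.newAPIHadoopRDD(hadoopConf,
      classOf[AccumuloInputFormat], classOf[Key], classOf[Value])

    // Reading itself works, but any action that has to serialize the
    // records (e.g. collect, or anything that shuffles) fails, because
    // Key is Writable but not java.io.Serializable.
    rdd.collect()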

Accumulo deals with Key (and Value) through Hadoop's Writable interface 
(technically WritableComparable, but that still stems from Writable). 
I'm not sure whether there's a way to tell Spark to use the Writable 
interface methods instead of Serializable.
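
One thing that might sidestep this (a sketch only -- I haven't verified 
it against Accumulo's classes): Spark's Kryo serializer doesn't require 
java.io.Serializable, so on Spark 1.2+ you could try registering Key and 
Value with Kryo:

    import org.apache.accumulo.core.data.{Key, Value}
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("accumulo-kryo")
      // Kryo serializes via registered classes rather than
      // java.io.Serializable, so Writable-only classes can pass through.
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .registerKryoClasses(Array(classOf[Key], classOf[Value]))

    val sc = new SparkContext(conf)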

If there isn't a way to do this, I don't see any reason why we couldn't 
make Key (and Value) also implement Serializable for this use case. 
Please open an issue on JIRA for this -- we can track the investigation 
there.
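
In the meantime, a purely user-side workaround is to convert each 
Key/Value pair into plain serializable types immediately after the 
Hadoop read, before anything is shuffled or collected. Continuing the 
sketch above (getRow and get are Key/Value accessors; converting to 
Strings is just one choice):

    // rdd: RDD[(Key, Value)] as produced by newAPIHadoopRDD above.
    val entries = rdd.map { case (key, value) =>
      (key.getRow.toString, new String(value.get(), "UTF-8"))
    }

    // From here on only Strings are serialized, so this succeeds.
    entries.collect()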

Thanks!

- Josh

madhvi wrote:
> Hi,
>
> While connecting to Accumulo from Spark by building a Spark RDD, I am
> getting the following error:
> object not serializable (class: org.apache.accumulo.core.data.Key)
>
> This is due to Accumulo's Key class, which does not implement the
> Serializable interface. How can this be solved so that Accumulo can be
> used with Spark?
>
> Thanks
> Madhvi
