hbase-user mailing list archives

From Sean Busbey <bus...@cloudera.com>
Subject Re: No applicable class implementing Serialization in conf at io.serializations: class org.apache.hadoop.hbase.client.Put
Date Sun, 02 Nov 2014 21:41:28 GMT
In the 0.94.x API, Put implemented Writable[1]. This meant that MR code,
like yours, could use it as a Key or Value between Mapper and Reducer.

In 0.96 and later APIs, Put no longer directly implements Writable[2].
Instead, HBase now includes a Hadoop Serialization implementation.
Normally, this would be configured via the TableMapReduceUtil class for
either a TableMapper or TableReducer.
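In a unit test like the one quoted below, where TableMapReduceUtil is never invoked, the same serializations can be registered by hand on the test driver's Configuration. This is a sketch, not a definitive fix — `mapDriver` stands in for whatever MRUnit driver the test uses, and the wiring assumes the 0.96+ serialization classes from the `org.apache.hadoop.hbase.mapreduce` package:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.mapreduce.KeyValueSerialization;
import org.apache.hadoop.hbase.mapreduce.MutationSerialization;
import org.apache.hadoop.hbase.mapreduce.ResultSerialization;

// In the test setup: append HBase's serializations to the default
// Writable serialization so the framework can copy Put instances.
// "mapDriver" is a placeholder for the test's MRUnit driver.
Configuration conf = mapDriver.getConfiguration();
conf.setStrings("io.serializations",
        conf.get("io.serializations"),
        MutationSerialization.class.getName(),
        ResultSerialization.class.getName(),
        KeyValueSerialization.class.getName());
```

Note that the existing value of `io.serializations` is passed through first, so plain Writable keys and values keep working alongside Put.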

Presuming that the intention of your MR job is to have all the Puts write
to some HBase table, you should be able to follow the "write to HBase" part
of the examples for reading and writing HBase via mapreduce in the
reference guide[3].

Specifically, you should have your Driver call one of the
initTableReducerJob methods on TableMapReduceUtil, where it currently sets
the Mapper class for your application[4].
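As a rough sketch of that Driver change — the job name and table name here are placeholders, not taken from your code — passing null as the reducer class tells initTableReducerJob to use HBase's IdentityTableReducer, which simply writes the mapper's Puts to the table:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

// Driver configuration fragment; table and job names are placeholders.
Configuration conf = HBaseConfiguration.create();
Job job = Job.getInstance(conf, "item-recommendations");
job.setJarByClass(ItemRecommendationHBaseMapper.class);
job.setMapperClass(ItemRecommendationHBaseMapper.class);
job.setMapOutputKeyClass(ImmutableBytesWritable.class);
job.setMapOutputValueClass(Put.class);
// Registers HBase's serializations on the job config and, with a null
// reducer class, installs IdentityTableReducer to emit the Puts:
TableMapReduceUtil.initTableReducerJob("recommendations", null, job);
```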

-Sean

[1]:
http://hbase.apache.org/0.94/apidocs/org/apache/hadoop/hbase/client/Put.html
[2]: http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/client/Put.html
[3]: http://hbase.apache.org/book/mapreduce.example.html
[4]:
http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html


On Sun, Nov 2, 2014 at 3:02 PM, Serega Sheypak <serega.sheypak@gmail.com>
wrote:

> Hi, I'm migrating from CDH4 to CDH5 (HBase 0.98.6-cdh5.2.0).
> I had a unit test for a mapper used to create HFiles for a later bulk load.
>
> I've bumped the Maven deps from CDH4 to CDH5 (0.98.6-cdh5.2.0),
> and now I get this exception:
>
> java.lang.IllegalStateException: No applicable class implementing
> Serialization in conf at io.serializations: class
> org.apache.hadoop.hbase.client.Put
> at com.google.common.base.Preconditions.checkState(Preconditions.java:149)
> at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:75)
> at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:97)
> at org.apache.hadoop.mrunit.internal.output.MockOutputCollector.collect(MockOutputCollector.java:48)
> at org.apache.hadoop.mrunit.internal.mapreduce.AbstractMockContextWrapper$4.answer(AbstractMockContextWrapper.java:90)
> at org.mockito.internal.stubbing.StubbedInvocationMatcher.answer(StubbedInvocationMatcher.java:34)
> at org.mockito.internal.handler.MockHandlerImpl.handle(MockHandlerImpl.java:91)
> at org.mockito.internal.handler.NullResultGuardian.handle(NullResultGuardian.java:29)
> at org.mockito.internal.handler.InvocationNotifierHandler.handle(InvocationNotifierHandler.java:38)
> at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:51)
> at org.apache.hadoop.mapreduce.Mapper$Context$$EnhancerByMockitoWithCGLIB$$ba4633fb.write(<generated>)
>
>
> And here is the mapper code:
>
> public class ItemRecommendationHBaseMapper
>         extends Mapper<LongWritable, BytesWritable, ImmutableBytesWritable, Put> {
>
>     private final ImmutableBytesWritable hbaseKey = new ImmutableBytesWritable();
>     private final DynamicObjectSerDe<ItemRecommendation> serde =
>             new DynamicObjectSerDe<ItemRecommendation>(ItemRecommendation.class);
>
>     @Override
>     protected void map(LongWritable key, BytesWritable value, Context context)
>             throws IOException, InterruptedException {
>         checkPreconditions(key, value);
>         hbaseKey.set(Bytes.toBytes(key.get()));
>
>         ItemRecommendation item = serde.deserialize(value.getBytes());
>         checkPreconditions(item);
>         Put put = PutFactory.createPut(serde, item, getColumnFamily());
>
>         context.write(hbaseKey, put); // Exception here
>     }
>
> What can I do to make the unit test pass?
>



-- 
Sean
