From: Sean Busbey <busbey@cloudera.com>
Date: Sun, 2 Nov 2014 16:08:04 -0600
Subject: Re: No applicable class implementing Serialization in conf at io.serializations: class org.apache.hadoop.hbase.client.Put
To: user@hbase.apache.org

If you're calling HFileOutputFormat.configureIncrementalLoad, that should
be setting up the Serialization for you. Can you look at the job
configuration and see what's present for the key "io.serializations"?

-Sean

On Sun, Nov 2, 2014 at 3:53 PM, Serega Sheypak wrote:

> I use it to prepare an HFile, using my custom mapper emitting Put and
> HFileOutputFormat.configureIncrementalLoad(job, createHTable())
> // connection to target table
>
> and then bulk load the data into the table using LoadIncrementalHFiles.
>
> P.S.
> HFileOutputFormat is also deprecated... so many changes... (((
>
>
> 2014-11-03 0:41 GMT+03:00 Sean Busbey:
>
> > In the 0.94.x API, Put implemented Writable [1]. This meant that MR code,
> > like yours, could use it as a Key or Value between a Mapper and Reducer.
> >
> > In the 0.96 and later APIs, Put no longer directly implements
> > Writable [2]. Instead, HBase now includes a Hadoop Serialization
> > implementation. Normally, this would be configured via the
> > TableMapReduceUtil class for either a TableMapper or TableReducer.
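Concretely, once that configuration step has run, the "io.serializations" key in the job configuration would typically contain something like the following (a sketch of the HBase 0.96+ class list; the exact contents can vary by version):

```
io.serializations=org.apache.hadoop.io.serializer.WritableSerialization,
                  org.apache.hadoop.hbase.mapreduce.MutationSerialization,
                  org.apache.hadoop.hbase.mapreduce.ResultSerialization,
                  org.apache.hadoop.hbase.mapreduce.KeyValueSerialization
```

MutationSerialization is the entry that covers Put, since Put is a subclass of Mutation.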
> > Presuming that the intention of your MR job is to have all the Puts
> > written to some HBase table, you should be able to follow the "write to
> > HBase" part of the examples for reading and writing HBase via MapReduce
> > in the reference guide [3].
> >
> > Specifically, you should have your Driver call one of the
> > initTableReducerJob methods on TableMapReduceUtil, where it currently
> > sets the Mapper class for your application [4].
> >
> > -Sean
> >
> > [1]: http://hbase.apache.org/0.94/apidocs/org/apache/hadoop/hbase/client/Put.html
> > [2]: http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/client/Put.html
> > [3]: http://hbase.apache.org/book/mapreduce.example.html
> > [4]: http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableMapReduceUtil.html
> >
> >
> > On Sun, Nov 2, 2014 at 3:02 PM, Serega Sheypak wrote:
> >
> > > Hi, I'm migrating from CDH4 to CDH5 (HBase 0.98.6-cdh5.2.0).
> > > I had a unit test for the mapper used to create an HFile and bulk load
> > > it later.
> > > I've bumped the Maven deps from CDH4 to CDH5 (0.98.6-cdh5.2.0), and
> > > now I've started to get this exception:
> > >
> > > java.lang.IllegalStateException: No applicable class implementing
> > > Serialization in conf at io.serializations: class
> > > org.apache.hadoop.hbase.client.Put
> > >   at com.google.common.base.Preconditions.checkState(Preconditions.java:149)
> > >   at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:75)
> > >   at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:97)
> > >   at org.apache.hadoop.mrunit.internal.output.MockOutputCollector.collect(MockOutputCollector.java:48)
> > >   at org.apache.hadoop.mrunit.internal.mapreduce.AbstractMockContextWrapper$4.answer(AbstractMockContextWrapper.java:90)
> > >   at org.mockito.internal.stubbing.StubbedInvocationMatcher.answer(StubbedInvocationMatcher.java:34)
> > >   at org.mockito.internal.handler.MockHandlerImpl.handle(MockHandlerImpl.java:91)
> > >   at org.mockito.internal.handler.NullResultGuardian.handle(NullResultGuardian.java:29)
> > >   at org.mockito.internal.handler.InvocationNotifierHandler.handle(InvocationNotifierHandler.java:38)
> > >   at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:51)
> > >   at org.apache.hadoop.mapreduce.Mapper$Context$$EnhancerByMockitoWithCGLIB$$ba4633fb.write()
> > >
> > > And here is the mapper code:
> > >
> > > public class ItemRecommendationHBaseMapper
> > >         extends Mapper<LongWritable, BytesWritable, ImmutableBytesWritable, Put> {
> > >
> > >     private final ImmutableBytesWritable hbaseKey = new ImmutableBytesWritable();
> > >     private final DynamicObjectSerDe<ItemRecommendation> serde =
> > >             new DynamicObjectSerDe<>(ItemRecommendation.class);
> > >
> > >     @Override
> > >     protected void map(LongWritable key, BytesWritable value, Context context)
> > >             throws IOException, InterruptedException {
> > >         checkPreconditions(key, value);
> > >         hbaseKey.set(Bytes.toBytes(key.get()));
> > >
> > >         ItemRecommendation item = serde.deserialize(value.getBytes());
> > >         checkPreconditions(item);
> > >         Put put = PutFactory.createPut(serde, item, getColumnFamily());
> > >
> > >         context.write(hbaseKey, put); // Exception here
> > >     }
> > >
> > > What can I do in order to make the unit test pass?
> >
> > --
> > Sean

--
Sean
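Since the exception above comes from MRUnit's MockOutputCollector rather than a real cluster, a common way to make the unit test itself pass is to register HBase's Serialization classes with the MRUnit driver's configuration. The sketch below assumes the mapper from the thread plus JUnit and MRUnit on the classpath; the test class name and setup wiring are hypothetical, and this is a frequently used workaround, not advice quoted from the thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.KeyValueSerialization;
import org.apache.hadoop.hbase.mapreduce.MutationSerialization;
import org.apache.hadoop.hbase.mapreduce.ResultSerialization;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.serializer.WritableSerialization;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;

public class ItemRecommendationHBaseMapperTest {

    private MapDriver<LongWritable, BytesWritable, ImmutableBytesWritable, Put> mapDriver;

    @Before
    public void setUp() {
        mapDriver = MapDriver.newMapDriver(new ItemRecommendationHBaseMapper());
        Configuration conf = mapDriver.getConfiguration();
        // MRUnit copies every emitted key/value through the serializations
        // listed under "io.serializations". In HBase 0.96+ APIs, Put is a
        // Mutation, so MutationSerialization is the entry that handles it.
        conf.setStrings("io.serializations",
                WritableSerialization.class.getName(),
                MutationSerialization.class.getName(),
                ResultSerialization.class.getName(),
                KeyValueSerialization.class.getName());
    }
}
```

With those serializations registered, MRUnit's internal Serialization.copy should find a handler for org.apache.hadoop.hbase.client.Put and the IllegalStateException should no longer be thrown when the test calls context.write.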