Subject: Re: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.compressionByName
From: Ted Yu
To: user@hbase.apache.org
Date: Thu, 19 Jun 2014 17:48:01 -0700

AbstractHFileWriter is in the hbase-server module. Adding hbase-it works because hbase-it depends on hbase-server. The following comes from the dependency tree output in trunk (it is the same for 0.96):

[INFO] org.apache.hbase:hbase-it:jar:0.99.0-SNAPSHOT
...
[INFO] +- org.apache.hbase:hbase-server:jar:0.99.0-SNAPSHOT:compile

On Thu, Jun 19, 2014 at 5:38 PM, Chen Wang wrote:

> Figured it out. Need
>
>     <dependency>
>       <groupId>org.apache.hbase</groupId>
>       <artifactId>hbase-it</artifactId>
>       <version>0.96.1.1-cdh5.0.1</version>
>     </dependency>
>
> as well.
>
> On Thu, Jun 19, 2014 at 5:21 PM, Chen Wang wrote:
>
> > Hi folks,
> >
> > I am running a bulk load with HFileOutputFormat. The reducer throws the
> > following NoSuchMethodError. Just wondering where this class is?
> > My pom looks like this (0.96.1.1-cdh5.0.1):
> >
> >     <dependency>
> >       <groupId>org.apache.hadoop</groupId>
> >       <artifactId>hadoop-client</artifactId>
> >       <version>2.3.0-mr1-cdh5.0.1</version>
> >     </dependency>
> >     <dependency>
> >       <groupId>org.apache.hadoop</groupId>
> >       <artifactId>hadoop-core</artifactId>
> >       <version>2.3.0-mr1-cdh5.0.1</version>
> >     </dependency>
> >     <dependency>
> >       <groupId>org.apache.hbase</groupId>
> >       <artifactId>hbase</artifactId>
> >       <version>0.96.1.1-cdh5.0.1</version>
> >       <type>pom</type>
> >     </dependency>
> >     <dependency>
> >       <groupId>org.apache.hbase</groupId>
> >       <artifactId>hbase-common</artifactId>
> >       <version>0.96.1.1-cdh5.0.1</version>
> >     </dependency>
> >     <dependency>
> >       <groupId>org.apache.httpcomponents</groupId>
> >       <artifactId>httpclient</artifactId>
> >       <version>4.1.1</version>
> >     </dependency>
> >
> > 2014-06-19 17:09:52,496 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.compressionByName(Ljava/lang/String;)Lorg/apache/hadoop/hbase/io/compress/Compression$Algorithm;
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:220)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:174)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:133)
> >     at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:558)
> >     at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
> >     at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:105)
> >     at org.apache.hadoop.hbase.mapreduce.PutSortReducer.reduce(PutSortReducer.java:72)
> >     at org.apache.hadoop.hbase.mapreduce.PutSortReducer.reduce(PutSortReducer.java:40)
> >     at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
> >     at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
> >     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
> >     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:415)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
> >     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> >
> > Thanks!
> > Chen
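[Note on diagnosing this class of error, not part of the original thread.] A NoSuchMethodError at runtime usually means the class resolved against a different jar than the one you compiled against. A quick way to see which jar actually supplied a class is to ask for its code source via reflection. This is a generic sketch (the `WhereIsClass` name and fallback strings are made up for illustration); on the cluster you would pass the class from the stack trace, `org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter`.

```java
// Sketch: report where a class was loaded from, to spot stale or duplicate jars.
public class WhereIsClass {
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK bootstrap classes report no code source.
            return src == null ? "bootstrap class loader (JDK)"
                               : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // e.g. java WhereIsClass org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter
        String target = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(target + " -> " + locate(target));
    }
}
```

Per Ted's reply, depending on hbase-server directly (rather than pulling it in transitively through hbase-it) should also make AbstractHFileWriter available to the reducer.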