hbase-user mailing list archives

From Chen Wang <chen.apache.s...@gmail.com>
Subject java.lang.NoSuchMethodError: org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.compressionByName
Date Fri, 20 Jun 2014 00:21:24 GMT
Hi folks,

I am running a bulk load with HFileOutputFormat, and the reducer throws the
following NoSuchMethodError. Just wondering where this class is?

My pom looks like this (HBase 0.96.1.1-cdh5.0.1):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.3.0-mr1-cdh5.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>2.3.0-mr1-cdh5.0.1</version>
</dependency>
<!--
<dependency>
    <groupId>com.mapr.hadoop</groupId>
    <artifactId>maprfs</artifactId>
    <version>1.0.3-mapr-2.1.3.1</version>
</dependency>
-->
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>0.96.1.1-cdh5.0.1</version>
    <type>pom</type>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-common</artifactId>
    <version>0.96.1.1-cdh5.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.1.1</version>
</dependency>
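
My guess is that AbstractHFileWriter (and HFileOutputFormat2) live in the
hbase-server module, which I am not declaring directly, so maybe I need
something like the dependency below, with the version just mirroring my other
HBase artifacts. Not sure if that alone would explain the method signature
mismatch at runtime, though:

<!-- possibly missing? hbase-server contains org.apache.hadoop.hbase.io.hfile -->
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>0.96.1.1-cdh5.0.1</version>
</dependency>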



2014-06-19 17:09:52,496 FATAL [main]
org.apache.hadoop.mapred.YarnChild: Error running child :
java.lang.NoSuchMethodError:
org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.compressionByName(Ljava/lang/String;)Lorg/apache/hadoop/hbase/io/compress/Compression$Algorithm;
	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:220)
	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:174)
	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:133)
	at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:558)
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
	at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:105)
	at org.apache.hadoop.hbase.mapreduce.PutSortReducer.reduce(PutSortReducer.java:72)
	at org.apache.hadoop.hbase.mapreduce.PutSortReducer.reduce(PutSortReducer.java:40)
	at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
	at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

Thanks!

Chen
