hbase-dev mailing list archives

From lars hofhansl <lhofha...@yahoo.com>
Subject Some HBase M/R confusion
Date Thu, 23 Feb 2012 02:36:08 GMT
According to the documentation there are two ways to run HBase M/R jobs:

1. The HBase book states to run M/R jobs like export here: http://hbase.apache.org/book/ops_mgt.html#export
bin/hbase org.apache.hadoop.hbase.mapreduce.Export <tablename> <outputdir> [<versions> [<starttime> [<endtime>]]]

2. Whereas the Javadoc says here: http://hbase.apache.org/docs/current/api/org/apache/hadoop/hbase/mapreduce/package-summary.html#package_description
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-0.90.0.jar export ...

In the first case (#1) I find that the job always fails to create the output dir:
java.io.IOException: Mkdirs failed to create file:/exports/_temporary/_attempt_local_0001_m_000000_0
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:378)


In the 2nd case (#2) I get past the creation of the output dir, and then it fails because
it cannot find class com.google.protobuf.Message.
I am using the HBase security branch and find that I need to add com.google.protobuf.Message.class
to the classes passed to TableMapReduceUtil.addDependencyJars.
If I do that, I can successfully run an export job using method #2.
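For what it's worth, a quick way to check on the client side where (if anywhere) a class resolves from is a small helper like this. This is a hypothetical diagnostic of my own (the WhichJar name and locate method are not part of HBase or Hadoop), just a sketch of how to tell whether a class such as com.google.protobuf.Message is visible locally at all:

```java
// Hypothetical diagnostic helper, not part of HBase: reports where a class
// was loaded from, which helps distinguish "jar missing on the client" from
// "jar not shipped to the M/R tasks" when a task dies with
// ClassNotFoundException.
public class WhichJar {
    // Returns the jar or directory the class was loaded from, or
    // "bootstrap classpath" for core JDK classes (their CodeSource is null).
    static String locate(Class<?> clazz) {
        java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classpath" : src.getLocation().toString();
    }

    public static void main(String[] args) throws ClassNotFoundException {
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(name + " -> " + locate(Class.forName(name)));
    }
}
```

If com.google.protobuf.Message resolves fine on the client but the task still fails, the jar simply isn't being shipped with the job, which is exactly the gap that adding the class to TableMapReduceUtil.addDependencyJars closes.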

The second issue looks like a bug in the HBase security branch.
I am not sure about the first issue. Is the documentation in the HBase book outdated?

-- Lars
