hbase-user mailing list archives

From "Andy Li" <annndy....@gmail.com>
Subject Re: Question about exception MasterNotRunningException when running HBaseAdmin and HTable in MapReduce program
Date Fri, 29 Feb 2008 06:23:16 GMT
Thanks.  Specifying the HBase configuration path in HADOOP_CLASSPATH
solved the problem.

An example here:
export HBASE_HOME=/hadoop/contrib/hbase
export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.2.0-dev.jar:$HBASE_HOME/hbase-0.2.0-dev-test.jar:$HBASE_HOME/lib/:$HBASE_HOME/conf

It looks like explicitly setting HBASE_CONF_DIR on its own has no effect.
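For anyone hitting the same thing: the conf dir has to be on the classpath because Hadoop's Configuration resolves hbase-site.xml as a classpath resource, not via HBASE_CONF_DIR. A quick stand-alone check (ConfCheck is just an illustrative name; it uses only the JDK and looks the file up through the classloader the same way):

```java
public class ConfCheck {
    public static void main(String[] args) {
        // Resolve hbase-site.xml through the classloader, the same
        // mechanism Hadoop uses for its configuration resources.
        java.net.URL site = ConfCheck.class.getClassLoader()
                .getResource("hbase-site.xml");
        if (site == null) {
            // This is what the reduce tasks were effectively seeing:
            // no site file, so the client fell back to defaults and
            // could not find the master.
            System.out.println("hbase-site.xml not found on classpath");
        } else {
            System.out.println("hbase-site.xml found at " + site);
        }
    }
}
```

Run it with the same classpath your tasks get; if it prints "not found", the conf dir is missing from HADOOP_CLASSPATH.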

On Thu, Feb 28, 2008 at 9:59 PM, stack <stack@duboce.net> wrote:

> Is HBASE_HOME defined when the hadoop-env.sh runs?  Try expanding it?
> Rather than adding HBASE_CONF_DIR, add the conf dir to the
> HADOOP_CLASSPATH (I just updated the example in the wiki page to include
> the conf dir if that helps).
>
> Your code looks fine.
> St.Ack
>
> Andy Li wrote:
> > Thanks for your response.
> >
> > Here is the following configuration I am using in hadoop-env.sh.
> > For those not pasted, the default values shipped with hadoop-env.sh are
> > used.
> >
> > ======= COPY/PASTE STARTS HERE ========
> > export JAVA_HOME=/usr/java/jdk1.6.0_04
> > export HBASE_HOME=/opt/hadoop/contrib/hbase
> > export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.2.0-dev.jar:$HBASE_HOME/hbase-0.2.0-dev-test.jar:$HBASE_HOME/lib/
> > ======= COPY/PASTE ENDS HERE ========
> >
> > Do I have to explicitly specify HBASE_CONF_DIR in hadoop-env.sh?
> >
> > I'm still getting the same exception.  Here is a snippet of my code that
> > tries to access HBase.
> > ======= COPY/PASTE STARTS HERE ========
> > // Configuration is read from hbase-site.xml on the classpath
> > HBaseConfiguration _CONF = new HBaseConfiguration();
> > Text t_ratio = new Text("ratio:");
> > Text _TABLENAME = new Text("andrew-intelligent-table");
> > HTableDescriptor _DESC = new HTableDescriptor("test-mapreduce-table");
> > _DESC.addFamily(new HColumnDescriptor("ratio:"));
> > HBaseAdmin _ADMIN = null;
> > HTable _TABLE = null;
> > try {
> >     _ADMIN = new HBaseAdmin(_CONF);
> >     if (!_ADMIN.tableExists(_TABLENAME)) {
> >         System.out.println("Table doesn't exist; creating a new table");
> >         _ADMIN.createTable(_DESC);
> >     }
> >     _TABLE = new HTable(_CONF, _TABLENAME);
> > } catch (Exception exx) {
> >     System.out.println("Error while opening the table");
> >     exx.printStackTrace();
> > }
> > ======= COPY/PASTE ENDS HERE ========
> >
> > Any help or input is appreciated.  Thanks.
> >
> > -annndy
> >
> > On Thu, Feb 28, 2008 at 8:27 PM, Peeyush Bishnoi <peeyushb@yahoo-inc.com>
> > wrote:
> >
> >
> >> Hi Andy,
> >>
> >> Check $HADOOP_CLASSPATH in hadoop-env.sh; it should include the HBase
> >> jar file and HBASE_CONF_DIR, and the entries should be exported
> >> properly. This hadoop-env.sh should be set on all the machines in the
> >> cluster where your HRegionServer is running.
> >>
> >> The hbase-site.xml file should have the HMaster value set.
> >>
> >> ---
> >> Peeyush
> >>
> >>
> >> On Thu, 2008-02-28 at 18:58 -0800, Andy Li wrote:
> >>
> >>
> >>> Dears,
> >>>
> >>> Not sure if anyone has encountered the same problem.
> >>> I am running Hadoop 0.17 with HBase 0.2 on Linux x86_64 with JDK 1.6.
> >>>
> >>> I get the following exception in my program when I try to invoke
> >>> HBaseAdmin and HTable in the 'reduce' phase to insert data into the
> >>> table.  I keep getting the same exception when I try to get an HTable
> >>> reference.  I can run the HBase shell, but cannot make the MR program
> >>> work.
> >>>
> >>> org.apache.hadoop.hbase.MasterNotRunningException
> >>>     at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:206)
> >>>     at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:70)
> >>>     at hbasetest.mapreduce.URLHBaseDictReduce.reduce(URLHBaseDictReduce.java:68)
> >>>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:333)
> >>>     at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2089)
> >>>
> >>> java.lang.NullPointerException
> >>>     at hbasetest.mapreduce.URLHBaseDictReduce.reduce(URLHBaseDictReduce.java:148)
> >>>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:333)
> >>>     at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2089)
> >>>
> >>> Any idea?
> >>>
> >>> Thanks,
> >>> annndy
> >>>
> >
> >
>
>
