hbase-user mailing list archives

From praveenesh kumar <praveen...@gmail.com>
Subject Re: Does Hadoop 0.20.2 and HBase 0.90.3 compatible ??
Date Mon, 06 Jun 2011 07:11:56 GMT
Hello guys..!!!

I copied the hadoop-core-append jar file from the hbase/lib folder into the
hadoop folder, replacing hadoop-0.20.2-core.jar, as suggested in the
following link:

http://www.apacheserver.net/Using-Hadoop-bundled-in-lib-directory-HBase-at1136240.htm

I believe this is what was suggested in the link and by Andy. If I am doing
something wrong, kindly tell me.
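
For reference, here is a rough Python sketch of the jar swap I performed on
each node (the paths and the way the append jar is located are assumptions
from my own setup, so please check the exact file name in your hbase/lib):

    # Rough sketch of the jar swap described above; paths and jar name are
    # assumptions from my setup, not exact values.
    import os
    import shutil

    HADOOP_HOME = "/usr/local/hadoop"   # assumed Hadoop install directory
    HBASE_LIB = "/usr/local/hbase/lib"  # assumed HBase lib directory

    # Locate the append build of the Hadoop core jar bundled with HBase 0.90.3.
    append_jar = next(f for f in os.listdir(HBASE_LIB)
                      if f.startswith("hadoop-core") and f.endswith(".jar"))

    # Remove the stock hadoop-0.20.2-core.jar so only one core jar remains
    # on the classpath.
    old_jar = os.path.join(HADOOP_HOME, "hadoop-0.20.2-core.jar")
    if os.path.exists(old_jar):
        os.remove(old_jar)

    # Copy the append jar from hbase/lib into the Hadoop install directory.
    shutil.copy2(os.path.join(HBASE_LIB, append_jar),
                 os.path.join(HADOOP_HOME, append_jar))
    print("Replaced %s with %s" % (old_jar, append_jar))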

But now, after adding that jar file, I am not able to start Hadoop. I am
getting the following exception messages on my screen:

ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/util/PlatformName
ub13: Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.util.PlatformName
ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
ub13:   at java.security.AccessController.doPrivileged(Native Method)
ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
Program will exit.
ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/hdfs/server/datanode/DataNode
ub13: starting secondarynamenode, logging to
/usr/local/hadoop/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ub13.out
ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/util/PlatformName
ub13: Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.util.PlatformName
ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
ub13:   at java.security.AccessController.doPrivileged(Native Method)
ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
Program will exit.
ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/hdfs/server/namenode/SecondaryNameNode

Have I done something wrong? Please guide me!

Thanks,
Praveenesh

On Fri, Jun 3, 2011 at 8:17 PM, Cosmin Lehene <clehene@adobe.com> wrote:

> Also, I think we should make it clearer (on the front page) that HBase
> requires the append branch, and provide a link to it as well.
>
> Cosmin
>
> On Jun 3, 2011, at 3:23 PM, Brian Bockelman wrote:
>
> > A meta-comment here (triggered by Praveenesh's comments about not wanting
> > to re-install):
> >
> > If you have to install the same software on more than one machine, you
> > really ought to automate it. In the long run, you'll save time if you can
> > roll out changes instantly rather than doing it manually. In all
> > likelihood, if you're going to be working with a piece of software
> > (Hadoop-based or not!), you'll re-install it a few times.
> >
> > The install of HDFS should take roughly the same amount of time on 2, 20,
> or 200 nodes.
> >
> > Brian
> >
> > On Jun 3, 2011, at 6:47 AM, Andrew Purtell wrote:
> >
> >>> Is *Hadoop 0.20.2 also not compatible with HBase 0.90.3*?
> >>
> >> In a strict sense they are compatible, but without append support HBase
> >> cannot guarantee that the last block of a write-ahead log is synced to
> >> disk, so in some failure cases edits will be lost. With append support,
> >> that hole is closed. The append branch also adds a change to lease
> >> recovery that allows the HBase Master to take ownership of regionserver
> >> logs so it can split them quickly. (Without this patch you may be waiting
> >> 10 minutes for lease recovery...) So these differences are clearly
> >> important for durability and fault recovery in a realistic time frame.
> >>
> >> A full reinstallation is not necessary.
> >>
> >> You can take the Hadoop core jar packaged in lib/ of HBase 0.90.3 and
> >> replace every Hadoop core jar on your cluster with it.
> >>
> >> OR
> >>
> >> You can compile 0.20-append and just replace the Hadoop core jar
> >> everywhere with the result, including the one in the HBase lib/.
> >>
> >> - Andy
> >>
> >> --- On Fri, 6/3/11, praveenesh kumar <praveenesh@gmail.com> wrote:
> >>
> >>> From: praveenesh kumar <praveenesh@gmail.com>
> >>> Subject: Does Hadoop 0.20.2 and HBase 0.90.3 compatible ??
> >>> To: common-user@hadoop.apache.org, user@hbase.apache.org
> >>> Date: Friday, June 3, 2011, 3:37 AM
> >>> Guys,
> >>>
> >>> I am really confused. Please, I need your feedback and
> >>> suggestions.
> >>>
> >>> The scenario is like this...
> >>>
> >>> I set up a *Hadoop 0.20.2 cluster* of *12 nodes*.
> >>>
> >>> Then I set up an *HBase 0.90.3* *12-node cluster* on top
> >>> of it.
> >>>
> >>> But after all that experimenting and struggling, I read the following
> >>> SHOCKING line on my HBase web UI:
> >>>
> >>> *"You are currently running the HMaster without HDFS append support
> >>> enabled. This may result in data loss. Please see the HBase wiki for
> >>> details."*
> >>>
> >>> And when I searched more about it, I found Michael G. Noll's article
> >>> saying that *Hadoop 0.20.2 and HBase 0.90.2 are not compatible*.
> >>>
> >>> Is *Hadoop 0.20.2 also not compatible with HBase 0.90.3*?
> >>>
> >>> So does that mean I have to re-install Hadoop 0.20-append if I want to
> >>> use HBase?
> >>>
> >>> I struggled a lot to reach this stage. Do I have to do all of it
> >>> again?
> >>>
> >>> Is there any workaround that avoids re-installing everything
> >>> again?
> >>>
> >>> Please help..!!!  :-(
> >>>
> >>> Thanks,
> >>> Praveenesh
> >>>
> >
>
>
