hbase-user mailing list archives

From Mike Spreitzer <mspre...@us.ibm.com>
Subject Re: Using the Hadoop bundled in the lib directory of HBase
Date Thu, 10 Feb 2011 14:13:09 GMT
Yes, you've got it right.  Let me emphasize that what I did was *much* 
easier than the other way around --- which I tried first and in which I 
had problems.  The Cloudera release specifically depends on Sun security 
classes that are not in the Java (IBM's) that I used.  I tried building 
Hadoop's 0.20-append branch but had some difficulties and it took a long 
time.  The various build instructions I found all talked about running the 
regression test suite once or twice --- and a single run takes hours.  The 
first time I ran it, from a clean download and build, it had problems. And 
the instructions are confusing regarding building the native part.  The
instructions seem to say you can build and test without building the 
native support; how can that be?

Regards,
Mike Spreitzer
SMTP: mspreitz@us.ibm.com, Lotus Notes: Mike Spreitzer/Watson/IBM
Office phone: +1-914-784-6424 (IBM T/L 863-)
AOL Instant Messaging: M1k3Sprtzr



From:   Suraj Varma <svarma.ng@gmail.com>
To:     user@hbase.apache.org
Date:   02/10/2011 08:02 AM
Subject:        Re: Using the Hadoop bundled in the lib directory of HBase



This procedure does seem a bit opposite of what I've seen folks recommend
(and the way it is documented in the notsoquick.html). But it might be
equivalent in this specific case (not completely sure as scripts etc are
different). I'll let one of the experts comment on that.

If I understood you right, you took the hadoop 0.20.2 release (which does
not have append support needed to prevent data loss in some situations) and
installed that. Next you took hbase 0.90.0's hadoop-core.jar (which is from
a separately built branch-0.20-append) and copied that over to the hadoop
installation.
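In shell terms, the swap described above would look roughly like the sketch
below. The directory layout and the versioned jar names are placeholders
(the real jar under hbase-0.90.0/lib carries a specific revision in its
name); the mock `mkdir`/`touch` lines just stand in for real installs so the
two steps can be shown end to end.

```shell
# Mock layout standing in for real installs (names are illustrative only):
mkdir -p hadoop-0.20.2 hbase-0.90.0/lib
touch hadoop-0.20.2/hadoop-0.20.2-core.jar
touch hbase-0.90.0/lib/hadoop-core-0.20-append-example.jar

# 1. Delete the stock core jar from the hadoop install.
rm hadoop-0.20.2/hadoop-0.20.2-core.jar

# 2. Copy the append-branch jar that ships with hbase into the hadoop install.
cp hbase-0.90.0/lib/hadoop-core-*.jar hadoop-0.20.2/

# The hadoop install now contains only the append-branch jar.
ls hadoop-0.20.2
```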

What folks usually do is copy the hadoop install's jar file over to
hbase - so, if you have a Cloudera install, you would copy the
Cloudera-built hadoop jar over to your hbase install (replacing the hbase
hadoop jar).
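The usual direction, for comparison, goes the other way: the jar your
cluster actually runs replaces the one bundled under hbase's lib directory.
A rough sketch, with placeholder names standing in for the real installs and
jar versions:

```shell
# Mock layout standing in for real installs (names are illustrative only):
mkdir -p hadoop-install hbase-0.90.0/lib
touch hadoop-install/hadoop-core-cluster-example.jar
touch hbase-0.90.0/lib/hadoop-core-bundled-example.jar

# Replace the hbase-bundled hadoop jar with the one from the cluster,
# so client and cluster agree on the hadoop version:
rm hbase-0.90.0/lib/hadoop-core-*.jar
cp hadoop-install/hadoop-core-*.jar hbase-0.90.0/lib/

ls hbase-0.90.0/lib
```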

I'm guessing that in your specific situation since branch-0.20-append and
hadoop 0.20.2 are fairly close (other than the append changes), it "might"
work. But - not sure if this is what folks normally do ...

Can someone clarify this? The procedure Mike followed certainly is much
simpler in this specific case, as he doesn't have to build out his own
branch-0.20-append and can instead "reuse" the one that was built for
hbase-0.90.

Thanks,
--Suraj


On Mon, Feb 7, 2011 at 9:17 AM, Mike Spreitzer <mspreitz@us.ibm.com> 
wrote:

> After a few false starts, what I have done is: fetch the 0.20.2 release of
> hadoop core (which appears to be common + dfs + mapred), install it,
> delete hadoop/hadoop-core.jar, unpack the hbase distribution, copy its
> lib/hadoop-core-...jar file to hadoop/hadoop-...-core.jar, configure, and
> test.  It seems to be working.  Is that what you expected?  Should I
> expect subtle problems?
>
> If that was the right procedure, this could be explained a little more
> clearly at (http://hbase.apache.org/notsoquick.html#hadoop).  The first
> thing that set me on the wrong path was the statement that I have to
> either build my own Hadoop or use Cloudera; apparently that's not right, I
> can use a built release if I replace one jar in it.  That web page says
> "If you want to run HBase on an Hadoop cluster that is other than a
> version made from branch-0.20-append" (which is my case, using a standard
> release) "you must replace the hadoop jar found in the HBase lib directory
> with the hadoop jar you are running out on your cluster to avoid version
> mismatch issues" --- but I think it's the other way around in my case.
>
> Thanks,
> Mike Spreitzer
> SMTP: mspreitz@us.ibm.com, Lotus Notes: Mike Spreitzer/Watson/IBM
> Office phone: +1-914-784-6424 (IBM T/L 863-)
> AOL Instant Messaging: M1k3Sprtzr
>
>
>
> From:   Stack <stack@duboce.net>
> To:     user@hbase.apache.org
> Date:   02/07/2011 12:07 PM
> Subject:        Re: Using the Hadoop bundled in the lib directory of HBase
> Sent by:        saint.ack@gmail.com
>
>
>
> On Sun, Feb 6, 2011 at 9:31 PM, Vijay Raj <vijayraj@sargasdata.com> wrote:
> > Hadoop core contained hdfs / mapreduce, all bundled together until
> > 0.20.x.  Since 0.21, it got forked into common, hdfs and mapreduce
> > sub-projects.
> >
>
> What Vijay said.
>
> > In this case - what is needed is a 0.20.2 download from hadoop and
> > configuring the same. The hadoop-0.20.2.jar needs to be replaced by the
> > patched hadoop-0.20.2-xxxx.jar available in the HBASE_HOME/lib
> > directory, to make things work.
> >
>
> This is a little off.
>
> Here is our Hadoop story for 0.90.0:
> http://hbase.apache.org/notsoquick.html#hadoop
>
> It links to the branch.  If you need instructions on how to check out
> and build, just say (do we need to add pointers to the book?)
>
> St.Ack
>
>

