hbase-dev mailing list archives

From: Seth Yang <accounts.y...@gmail.com>
Subject: Re: Exception while using HBase trunk with hadoop - 2.0.3
Date: Thu, 13 Jun 2013 06:05:38 GMT

ramkrishna vasudevan <ramkrishna.s.vasudevan@...> writes:

> 
> It works now.  Thanks Ted for your time and help.
> 
> Regards
> Ram
> 
> On Fri, Feb 22, 2013 at 10:48 AM, Ted Yu <yuzhihong@...> wrote:
> 
> > Ram:
> > Here is what we have in pom.xml:
> >     <hadoop-two.version>2.0.2-alpha</hadoop-two.version>
> >
> > You can try the patch from HBASE-7904 and rebuild your HBase tar ball.
> >
> > Cheers
> >
> > On Thu, Feb 21, 2013 at 8:56 PM, Ted Yu <yuzhihong@...> wrote:
> >
> > > This indicates that the Hadoop 2.0 version HBase got built with lags behind the
> > > binary running as the Namenode.
> > >
> > > Cheers
> > >
> > >
> > > On Thu, Feb 21, 2013 at 8:42 PM, ramkrishna vasudevan <
> > > ramkrishna.s.vasudevan@...> wrote:
> > >
> > >> During this time NN says
> > >>
> > >> Incorrect header or version mismatch from 127.0.0.1:34789 got version 7 expected version 8
> > >>
> > >> Regards
> > >> Ram
> > >>
> > >>
> > >> On Thu, Feb 21, 2013 at 8:54 PM, ramkrishna vasudevan <
> > >> ramkrishna.s.vasudevan@...> wrote:
> > >>
> > >> > Hi Devs
> > >> >
> > >> > I tried to run the current HBase trunk snapshot with Hadoop 2.0.3-alpha.
> > >> >
> > >> > I got the following exception:
> > >> > java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "ram/10.239.47.144"; destination host is: "localhost":9000;
> > >> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
> > >> >   at org.apache.hadoop.ipc.Client.call(Client.java:1168)
> > >> >   at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
> > >> >   at $Proxy10.setSafeMode(Unknown Source)
> > >> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >> >   at java.lang.reflect.Method.invoke(Method.java:597)
> > >> >   at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
> > >> >   at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
> > >> >   at $Proxy10.setSafeMode(Unknown Source)
> > >> >   at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:514)
> > >> >   at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:1896)
> > >> >   at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:660)
> > >> >   at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:261)
> > >> >   at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:650)
> > >> >   at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:389)
> > >> >   at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:147)
> > >> >   at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:131)
> > >> >   at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:654)
> > >> >   at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:476)
> > >> >   at java.lang.Thread.run(Thread.java:662)
> > >> > Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
> > >> >   at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
> > >> >   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
> > >> >   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
> > >> >   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
> > >> >   at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:886)
> > >> >   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:817)
> > >> > 2013-02-20 20:44:01,928 INFO org.apache.hadoop.hbase.master.HMaster: Aborting
> > >> >
> > >> > I checked whether something similar had been raised on the dev list, but could not find anything.
> > >> > But when I tried with hadoop-1.0.4 it worked fine.
> > >> > Did anyone face this problem?
> > >> >
> > >> > Regards
> > >> > Ram
> > >> >
> > >>
> > >
> > >
> >
> 

Hi Ram,

I encountered exactly the same issue and hope I can get more details to solve it.

I followed your post and did the following (exact commands sketched below):
1. Added the line “<hadoop-two.version>2.0.2-alpha</hadoop-two.version>” to HBase's pom.xml.
2. Rebuilt with mvn clean install -Dhadoop.profile=2.0 -DskipTests assembly:assembly
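
If the real problem is that the bundled Hadoop client is older than the cluster, then my guess from this thread is that the property should point at the same version the Namenode is running. This is what I plan to try next, as a sketch only: 2.0.3-alpha is my assumption (taken from the cluster version mentioned earlier), and I am assuming hadoop-two.version can be overridden on the mvn command line rather than edited in pom.xml.

    # Rebuild the HBase tarball against the Hadoop version the cluster runs.
    # 2.0.3-alpha is my assumption from this thread; passing hadoop-two.version
    # as a -D property should take precedence over the value in pom.xml.
    mvn clean install assembly:assembly \
        -DskipTests \
        -Dhadoop.profile=2.0 \
        -Dhadoop-two.version=2.0.3-alpha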

But I still encountered the same error.
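
In case it helps narrow things down, this is how I have been checking which Hadoop client jars actually ended up in my build; the paths and tarball name below are my assumptions based on a default trunk layout.

    # List the Hadoop jars bundled into the assembled tarball (the
    # hbase-assembly/target path and file name are my assumptions).
    tar tzf hbase-assembly/target/hbase-*-bin.tar.gz | grep 'hadoop-.*\.jar'

    # Or, from an unpacked installation directory:
    ls lib/hadoop-*.jar

If the jars still show 2.0.2-alpha, that would at least be consistent with the NN complaining "got version 7 expected version 8".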

(I also tried to apply the HBASE-7904 patch, but I don't know where to start.)
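
For completeness, my rough understanding of how a JIRA patch is usually applied to a source checkout is sketched below; HBASE-7904.patch is only a placeholder name for whatever attachment the issue carries.

    # From the root of the HBase trunk checkout.
    # HBASE-7904.patch is a placeholder for the attachment downloaded from JIRA.
    # -p0 assumes the diff was generated without a/ b/ path prefixes;
    # if it has them, -p1 would be needed instead.
    patch -p0 --dry-run < HBASE-7904.patch   # verify it applies cleanly first
    patch -p0 < HBASE-7904.patch

    # Then rebuild the tarball as before:
    mvn clean install -DskipTests -Dhadoop.profile=2.0 assembly:assembly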

I wonder if you could give more details on what you did to solve your problem?

Thanks. 

Seth


