hbase-dev mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: Exception while using HBase trunk with hadoop - 2.0.3
Date Thu, 21 Feb 2013 15:54:46 GMT
Here was the issue:
HBASE-7715 FSUtils#waitOnSafeMode can incorrectly loop on standby NN
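The pattern under discussion - polling SAFEMODE_GET until the NameNode reports it has left safe mode - can be sketched roughly as below. This is a hedged illustration, not the actual FSUtils code: the `SafeModeSource` interface and the `waitOnSafeMode` signature here are hypothetical stand-ins for `DistributedFileSystem.setSafeMode(SAFEMODE_GET)` and `FSUtils#waitOnSafeMode`, kept free of Hadoop dependencies so the loop itself is visible.

```java
import java.util.Iterator;
import java.util.List;

public class SafeModeWait {

  /**
   * Hypothetical stand-in for DistributedFileSystem.setSafeMode(SAFEMODE_GET):
   * returns true while the NameNode is still in safe mode.
   */
  interface SafeModeSource {
    boolean isInSafeMode() throws InterruptedException;
  }

  /**
   * Poll until the source reports safe mode is off, sleeping between checks.
   * Returns true if safe mode was exited, false if attempts ran out.
   */
  static boolean waitOnSafeMode(SafeModeSource fs, long waitMillis, int maxAttempts)
      throws InterruptedException {
    for (int i = 0; i < maxAttempts; i++) {
      if (!fs.isInSafeMode()) {
        return true;              // NameNode is out of safe mode
      }
      Thread.sleep(waitMillis);   // back off before re-polling
    }
    return false;                 // still in safe mode after all attempts
  }

  public static void main(String[] args) throws InterruptedException {
    // Simulate a NameNode that leaves safe mode on the third poll.
    Iterator<Boolean> states = List.of(true, true, false).iterator();
    boolean out = waitOnSafeMode(states::next, 1L, 10);
    System.out.println(out ? "left safe mode" : "timed out"); // prints "left safe mode"
  }
}
```

The HBASE-7715 problem was with the loop's exit condition, not the polling call itself: against a standby NameNode the check can keep looping, which is orthogonal to the RPC failure in this thread.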

On Thu, Feb 21, 2013 at 7:48 AM, Anoop John <anoop.hbase@gmail.com> wrote:

> I remember there was some discussion recently around the use of
> dfs.setSafeMode(). I forget which issue it was.
>
> -Anoop-
>
> On Thu, Feb 21, 2013 at 9:12 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>
> > The exception came from the hadoop layer, while waiting to get out of
> > safe mode.
> > Here is the call:
> >
> >     } catch (Exception e) {
> >       if (e instanceof IOException) throw (IOException) e;
> >       // Check whether dfs is in safemode.
> >       inSafeMode = dfs.setSafeMode(
> >         org.apache.hadoop.hdfs.protocol.FSConstants.SafeModeAction.SAFEMODE_GET);
> >
> > I wish we had logged the exception prior to that call. It looks like
> > setSafeMode() with a boolean parameter is not supported.
> >
> > Cheers
> >
> > On Thu, Feb 21, 2013 at 7:24 AM, ramkrishna vasudevan <
> > ramkrishna.s.vasudevan@gmail.com> wrote:
> >
> > > Hi Devs
> > >
> > > I tried to run HBase current trunk snapshot with Hadoop 2.0.3 alpha.
> > >
> > > I got the following exception:
> > >
> > > java.io.IOException: Failed on local exception:
> > > com.google.protobuf.InvalidProtocolBufferException: Message missing
> > > required fields: callId, status; Host Details : local host is:
> > > "ram/10.239.47.144"; destination host is: "localhost":9000;
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1168)
> > > at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
> > > at $Proxy10.setSafeMode(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > at java.lang.reflect.Method.invoke(Method.java:597)
> > > at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
> > > at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
> > > at $Proxy10.setSafeMode(Unknown Source)
> > > at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:514)
> > > at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:1896)
> > > at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:660)
> > > at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:261)
> > > at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:650)
> > > at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:389)
> > > at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:147)
> > > at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:131)
> > > at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:654)
> > > at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:476)
> > > at java.lang.Thread.run(Thread.java:662)
> > > Caused by: com.google.protobuf.InvalidProtocolBufferException: Message
> > > missing required fields: callId, status
> > > at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
> > > at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
> > > at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
> > > at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:886)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:817)
> > > 2013-02-20 20:44:01,928 INFO org.apache.hadoop.hbase.master.HMaster:
> > > Aborting
> > >
> > > I checked whether something similar had been raised on the dev list,
> > > but could not find anything. When I tried with hadoop-1.0.4, it worked
> > > fine. Has anyone faced this problem?
> > >
> > > Regards
> > > Ram
> > >
> >
>
