hbase-dev mailing list archives

From ramkrishna vasudevan <ramkrishna.s.vasude...@gmail.com>
Subject Re: Exception while using HBase trunk with hadoop - 2.0.3
Date Fri, 22 Feb 2013 04:27:25 GMT
Just to add on:

As I said, I have two setups, and I verified the HBase lib dir in both.
The one compiled with profile 2.0 has the hadoop 2.0 jars, and the one
compiled with profile 1.0 has the hadoop 1.0 jars.
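
In case anyone wants to double-check the same thing, a quick sketch
(assuming the untarred assembly directory):

ls lib/hadoop-*.jar

On the 2.0 build I would expect hadoop-common-2.0.x and hadoop-hdfs-2.0.x
here, and hadoop-core-1.0.x on the 1.0 build.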

I used two ways of building and packaging:
mvn clean install -Dhadoop.profile=2.0 -DskipTests assembly:assembly
mvn -X -DskipTests help:active-profiles package assembly:assembly -Prelease -Dhadoop.profile=2.0

Neither helped.
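
As a sanity check on the profile itself, the resolved hadoop artifacts can
also be listed straight from maven (a sketch; run from the source root):

mvn dependency:tree -Dhadoop.profile=2.0 | grep -i hadoop

That should show which hadoop-common/hadoop-hdfs versions the 2.0 profile
actually pulls in.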

From the logs I can see that FileSystem.get() works fine.
But when the DFSClient in the master and the NN talk to each other, the
master sends hostname/ip, whereas the NN replies with hostname:port.
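
To take HBase's startup code out of the picture, the same RPC can be tried
directly with the bundled client jars. This is only a sketch, but since
bin/hbase can run an arbitrary class on HBase's classpath, the following
should issue the very setSafeMode call from the stack trace below:

bin/hbase org.apache.hadoop.hdfs.tools.DFSAdmin -safemode get

If the bundled hadoop client and the 2.0.3 NN disagree on the RPC response
header, I would expect the same InvalidProtocolBufferException here too.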

Contents of core-site.xml
====================
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
Contents of hdfs-site.xml
=====================
<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/home/ram/datadir</value>
  </property>
</configuration>
Contents of hbase-site.xml
=======================
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>localhost</value>
  </property>
</configuration>
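
Given the fs.defaultFS above, the NN side can be cross-checked with the
hadoop 2.0.3 install's own client, where the client and server jars are
guaranteed to match:

bin/hdfs dfsadmin -safemode get

If that prints "Safe mode is OFF", the NN itself is healthy and the
mismatch is on the HBase-bundled client side.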

This is just a single-node machine, and this is my first time trying HBase
trunk with hadoop 2.0.

Regards
Ram
On Thu, Feb 21, 2013 at 8:54 PM, ramkrishna vasudevan <
ramkrishna.s.vasudevan@gmail.com> wrote:

> Hi Devs
>
> I tried to run the current HBase trunk snapshot with Hadoop 2.0.3 alpha.
>
> I got the following exception
> java.io.IOException: Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Message missing
> required fields: callId, status; Host Details : local host is:
> "ram/10.239.47.144"; destination host is: "localhost":9000;
>   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
>   at org.apache.hadoop.ipc.Client.call(Client.java:1168)
>   at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>   at $Proxy10.setSafeMode(Unknown Source)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>   at java.lang.reflect.Method.invoke(Method.java:597)
>   at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>   at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>   at $Proxy10.setSafeMode(Unknown Source)
>   at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:514)
>   at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:1896)
>   at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:660)
>   at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:261)
>   at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:650)
>   at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:389)
>   at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:147)
>   at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:131)
>   at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:654)
>   at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:476)
>   at java.lang.Thread.run(Thread.java:662)
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Message
> missing required fields: callId, status
>   at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
>   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
>   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
>   at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
>   at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:886)
>   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:817)
> 2013-02-20 20:44:01,928 INFO org.apache.hadoop.hbase.master.HMaster: Aborting
>
> I checked whether something similar had been raised on the dev list, but
> could not find anything.
> But when I tried with hadoop 1.0.4 it worked fine.
> Has anyone faced this problem?
>
> Regards
> Ram
>
