hbase-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: issue about snapshot migration
Date Thu, 23 Oct 2014 02:09:18 GMT
Are the two clusters using the same version of Hadoop?

See '2.1.1.1. Apache HBase 0.94 with Hadoop 2' under
http://hbase.apache.org/book.html#d0e1440
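
A quick way to confirm, assuming the hadoop and hbase binaries are on the PATH of a
client node in each cluster (just a sketch; output varies between releases):

    # Run on a client node of each cluster and compare the reported versions.
    hadoop version
    hbase version

The "Message missing required fields: callId, status" error in your trace usually means
the Hadoop RPC wire format used by the client does not match what the destination
NameNode speaks, i.e. the two clusters are on incompatible Hadoop releases.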

Cheers

On Wed, Oct 22, 2014 at 6:34 PM, ch huang <justlooks@gmail.com> wrote:

> hi, mailing list:
>
>        I have two HBase clusters, one running version 0.94.6 and the other 0.98.1. I am
> trying to move the data of table A from the 0.94.6 cluster to the 0.98.1 cluster. I took a
> snapshot of A and then tried to use ExportSnapshot to copy the snapshot to the HDFS cluster
> that the 0.98.1 HBase runs on, but it failed. Why?
>
> # hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot -snapshot demo_shot -copy-to hdfs://192.168.26.231:8020/hbase
> Exception in thread "main" java.io.IOException: Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status;
> Host Details : local host is: "ch11/192.168.11.11"; destination host is: "hzih231":8020;
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1241)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>         at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>         at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:629)
>         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1545)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:820)
>         at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1378)
>         at org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:618)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:705)
>         at org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:709)
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
>         at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
>         at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
>         at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
>         at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
>         at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:948)
>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:846)
>
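
If the two clusters do turn out to be on incompatible Hadoop releases, one workaround
that is often suggested (a sketch only, not verified against this exact setup) is to let
ExportSnapshot copy over HTTP via webhdfs instead of the native HDFS RPC, which sidesteps
the wire-format mismatch. WebHDFS must be enabled on the destination NameNode
(dfs.webhdfs.enabled=true), and the port (50070 is the default NameNode HTTP port) and
the /hbase root directory below are assumptions to adjust for the destination cluster:

    # Run from the source (0.94.6) cluster; writes the snapshot files over HTTP
    # into the destination cluster's HBase root directory.
    hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
      -snapshot demo_shot \
      -copy-to webhdfs://192.168.26.231:50070/hbase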
