hbase-user mailing list archives

From Suraj Varma <svarma...@gmail.com>
Subject Re: Copy Data from one cluster to another
Date Mon, 03 Jan 2011 19:13:39 GMT
What version of HBase are you using? With 0.90RC, there is a CopyTable MR
job that might help you: https://issues.apache.org/jira/browse/HBASE-2221
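
To give a rough idea of the invocation (a sketch only -- 'mytable' and
'zk1.example.com' are placeholders, and the --peer.adr format is the
destination's ZooKeeper quorum, client port, and znode parent):

```shell
# Copy 'mytable' from the cluster you run this on to a remote cluster,
# identified by the remote ZooKeeper ensemble. Check the usage output of
# CopyTable on your exact version before relying on these flags.
hbase org.apache.hadoop.hbase.mapreduce.CopyTable \
  --peer.adr=zk1.example.com:2181:/hbase mytable
```

The table must already exist (with the same column families) on the
destination cluster before running the job.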

With 0.20.x, here's one user's account of how they copied HBase to another
data center; this might help you:
http://wiki.apache.org/hadoop/Hbase/MigrationToNewCluster
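
That approach boils down to disabling the tables and distcp'ing the HBase
directory between the two HDFS instances. A sketch, with placeholder
hostnames -- note that distcp must talk to the NameNode's RPC port (often
8020 or 9000), not a web-UI port:

```shell
# Copy the (disabled) table's files from the source HDFS to the target HDFS.
# src-nn / dst-nn and the port are placeholders for your NameNode addresses.
hadoop distcp hdfs://src-nn:8020/hbase/mytable hdfs://dst-nn:8020/hbase/mytable
```

After the copy you still need to make the target cluster aware of the new
regions, as described on that wiki page.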

--Suraj


On Sat, Jan 1, 2011 at 5:11 AM, Saumitra Chowdhury <
saumitra@smartitengineering.com> wrote:

> Hi all,
> I am new to HBase and am having the following problem.
> The situation is that we have two distributed clusters, each with a few RSs
> and a Master, in two different networks. Is it possible to transfer all
> data from one to the other? How?
>
> Besides, I was trying to move all data from one cluster to a standalone
> HBase on my PC.
> For this I tried the command ::
>                                   "hadoop fs -copyToLocal
> hdfs://smartrs1:60070/path /path/to/local/" [smartrs1 is the name of the
> server running the NN and HBase Master]
>
> The following error occurs :: *copyToLocal: Call to smartrs1/
> 192.168.1.3:60070 failed on local exception: java.io.EOFException*
> Again I tried to use the local jobtracker for that purpose, with the
> following command on local hdfs ::
>      " hadoop distcp -jt local hdfs://smartrs1:60070/path
> file:///path/to/local "  [smartrs1 is the name of the server running the
> NN and HBase Master, with IP 192.168.1.3]
> In that case the following error occurs. Can anyone help me in both these
> cases?
> *Copy failed: java.io.IOException: Call to smartrs1/192.168.1.3:60070
> failed on local exception: java.io.EOFException*
>    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1089)
>    at org.apache.hadoop.ipc.Client.call(Client.java:1057)
>    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
>    at $Proxy0.getProtocolVersion(Unknown Source)
>    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:369)
>    at
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
>    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
>    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
>    at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>    at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1489)
>    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>    at
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1523)
>    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1505)
>    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:227)
>    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
>    at org.apache.hadoop.tools.DistCp.checkSrcPath(DistCp.java:635)
>    at org.apache.hadoop.tools.DistCp.copy(DistCp.java:656)
>    at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
>    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>    at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)
> Caused by: java.io.EOFException
>    at java.io.DataInputStream.readInt(DataInputStream.java:375)
>    at
> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:781)
>    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:689)
>
