hadoop-common-user mailing list archives

From "Korb, Michael [USA]" <Korb_Mich...@bah.com>
Subject copying between hadoop instances
Date Tue, 08 Feb 2011 17:06:21 GMT
I have two Hadoop instances running on one cluster of machines as part of an upgrade.
I'm trying to copy all the files from the old instance to the new one, but I've had
trouble with both distcp and fs -cp.

Most recently, I've been trying "sudo -u hdfs ./hadoop fs -cp hftp://mc00001:50070/* hdfs://mc00000:55310/",
where mc00001 is the namenode of the old instance and mc00000 is the namenode of the new one.

I've had some success with this command (some files have actually been copied), but partway
through the copy I get this error:

cp: Server returned HTTP response code: 500 for URL: http://mc00000.mcloud.bah.com:50075/streamFile?filename=/user/cluster/annotated/2009/07/05/_logs/history/mc00002_1291306280950_job_201012021111_0518_cluster_com.bah.mapred.CombineFilesDriver%253A+netflow-smallfi&ugi=hdfs

Could this be a permissions issue? It also doesn't seem quite right to be copying * when
there are directories involved, but I don't think there's a way to make fs -cp recurse.
Could that be causing problems?
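For what it's worth, here is the shape of a distcp invocation I've also been experimenting with (same hostnames and ports as the fs -cp command above; the source path "/" is an assumption, i.e. copy the whole namespace). Unlike fs -cp with a glob, distcp copies directories recursively, so no * is needed:

```shell
# Run as the hdfs superuser to avoid permission problems on the source files.
# hftp:// reads from the old namenode's HTTP port (50070), which is the
# recommended source scheme when copying between different Hadoop versions;
# the destination uses the new namenode's RPC port (55310).
sudo -u hdfs ./hadoop distcp \
    hftp://mc00001:50070/ \
    hdfs://mc00000:55310/
```

distcp runs as a MapReduce job on the destination cluster, so the new instance's JobTracker has to be up for this to work.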
