From: monu.ogbe@richmondinformatics.com
To: hadoop-user@lucene.apache.org
Date: Wed, 29 Mar 2006 14:47:32 +0100
Subject: Help: -copyFromLocal

Hello Team,

I created a backup of my DFS database:

  # bin/hadoop dfs -copyToLocal /user/root/crawl /mylocaldir

I now want to restore from the backup using:

  # bin/hadoop dfs -copyFromLocal /mylocaldir/crawl /user/root

However, I'm getting the following error:

  copyFromLocal: Target /user/root/crawl/crawldb/current/part-00000/.data.crc already exists

I get this message with every permutation of the command that I've tried, and even after deleting all content in the DFS directories.

I'd be grateful for any pointers.

Many thanks,

Monu Ogbe
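
P.S. In case it helps, here is the full sequence I am attempting. The middle two commands (clearing the old copy in DFS, and stripping the local ".crc" checksum files that -copyToLocal seems to write alongside the data) are only my guesses at a workaround, not a confirmed procedure:

  # bin/hadoop dfs -copyToLocal /user/root/crawl /mylocaldir        <- backup to local disk
  # bin/hadoop dfs -rm /user/root/crawl                             <- guess: remove old DFS copy
  # find /mylocaldir/crawl -name '.*.crc' -exec rm {} \;            <- guess: drop local checksum files
  # bin/hadoop dfs -copyFromLocal /mylocaldir/crawl /user/root      <- restore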