Subject: issue about distcp "Source and target differ in block-size. Use -pb to preserve block-sizes during copy."
From: ch huang
To: user@hadoop.apache.org
Date: Fri, 25 Jul 2014 09:15:11 +0800

hi, maillist:
        I am trying to copy data from my old cluster to a new cluster with distcp, and the job fails with the error below. How can I handle this?
14/07/24 18:35:58 INFO mapreduce.Job: Task Id : attempt_1406182801379_0004_m_000000_1, Status : FAILED
Error: java.io.IOException: File copy failed: webhdfs://CH22:50070/mytest/pipe_url_bak/part-m-00001 --> webhdfs://develop/tmp/pipe_url_bak/part-m-00001
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying webhdfs://CH22:50070/mytest/pipe_url_bak/part-m-00001 to webhdfs://develop/tmp/pipe_url_bak/part-m-00001
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: java.io.IOException: Error writing request body to server
        at sun.net.www.protocol.http.HttpURLConnection$StreamingOutputStream.checkError(HttpURLConnection.java:3192)
        at sun.net.www.protocol.http.HttpURLConnection$StreamingOutputStream.write(HttpURLConnection.java:3175)
        at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
        at java.io.DataOutputStream.write(DataOutputStream.java:107)
        at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:231)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:164)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:118)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:95)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
14/07/24 18:35:59 INFO mapreduce.Job:  map 16% reduce 0%
14/07/24 18:39:39 INFO mapreduce.Job:  map 17% reduce 0%
14/07/24 19:04:27 INFO mapreduce.Job: Task Id : attempt_1406182801379_0004_m_000000_2, Status : FAILED
Error: java.io.IOException: File copy failed: webhdfs://CH22:50070/mytest/pipe_url_bak/part-m-00001 --> webhdfs://develop/tmp/pipe_url_bak/part-m-00001
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying webhdfs://CH22:50070/mytest/pipe_url_bak/part-m-00001 to webhdfs://develop/tmp/pipe_url_bak/part-m-00001
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
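The warning quoted in the subject points at distcp's -pb flag, which tells distcp to preserve the source block size when writing files on the target. A minimal sketch of a re-run with that flag, using the source and target URIs from the trace above (the original command line is not shown in this post, so the exact invocation is an assumption):

        # assumes the hosts/paths from the stack trace; adjust to the real clusters
        hadoop distcp -pb webhdfs://CH22:50070/mytest/pipe_url_bak webhdfs://develop/tmp/pipe_url_bak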