Subject: Re: distcp error.
From: simsiss@gmail.com
To: user@hadoop.apache.org
Date: Fri, 31 Aug 2012 10:15:55 +0900

Unsubscribe

On Aug 29, 2012, at 12:44 AM, "Tao" <zta@outlook.com> wrote:

> Hi, all
>
> I am using distcp to copy data from Hadoop 1.0.3 to Hadoop 2.0.1.
>
> When the file path (or file name) contains Chinese characters, an
> exception is thrown, as shown below. I need some help with this.
>
> Thanks.
>
> [hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log /tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试 hdfs://10.xx.xx.bb:54310/tmp/distcp_test14
>
> 12/08/28 23:32:31 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=true, maxMaps=14, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试], targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}
> 12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log
> 12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated. Instead, use mapreduce.task.io.sort.mb
> 12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated. Instead, use mapreduce.task.io.sort.factor
> 12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
> 12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
> 12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
> 12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
> 12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
> 12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040
> 12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job: http://baby20:8088/proxy/application_1345831938927_0039/
> 12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039
> 12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039
> 12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in uber mode : false
> 12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%
> 12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%
> 12/08/28 23:33:00 INFO mapreduce.Job: Task Id : attempt_1345831938927_0039_m_000000_0, Status : FAILED
> Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 --> hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
>         at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
>         at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
>         at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
> Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
>         at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
>         at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
>         ... 10 more
> Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 500
>         at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
>         at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
>         at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
>         at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
>         at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
>         at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
>         ... 11 more
> Caused by: java.io.IOException: HTTP_OK expected, received 500
>         at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
>         at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
>         at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
>         at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
>         at java.io.DataInputStream.read(DataInputStream.java:132)
>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
>         at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
>         at java.io.FilterInputStream.read(FilterInputStream.java:90)
>         at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
>         at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
>         ... 16 more
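The failing hop is the HFTP read from the 1.0.3 source cluster: HftpFileSystem's range request returns 500 instead of 200 only when the path contains non-ASCII characters, which suggests the path segment is reaching the namenode's HTTP servlet without percent-encoding. Below is a minimal Java sketch of that suspicion, not DistCp's actual code path; the class name, host, and port are illustrative placeholders.

    import java.net.URI;
    import java.net.URISyntaxException;

    // Hypothetical demo class, not part of Hadoop: contrasts a raw hftp URL
    // with a percent-encoded one for a path containing Chinese characters.
    public class HftpPathEncoding {
        public static void main(String[] args) throws URISyntaxException {
            // The source path from the failing job, with its Chinese segment.
            String rawPath = "/tmp/中文路径测试/part-r-00017";

            // Naive string concatenation leaves raw UTF-8 bytes in the HTTP
            // request line, which a servlet may reject with a 500.
            System.out.println("raw:     hftp://10.xx.xx.aa:50070" + rawPath);

            // Building a URI and emitting its ASCII form percent-encodes the
            // non-ASCII path characters into a form any HTTP server can parse.
            URI encoded = new URI("hftp", null, "10.xx.xx.aa", 50070,
                                  rawPath, null, null);
            System.out.println("encoded: " + encoded.toASCIIString());
            // -> hftp://10.xx.xx.aa:50070/tmp/%E4%B8%AD%E6%96%87%E8%B7%AF%E5%BE%84%E6%B5%8B%E8%AF%95/part-r-00017
        }
    }

If the percent-encoded form also draws a 500 from the 1.0.3 namenode, the problem would sit in the source side's servlet decoding rather than in how the URL is built, so this sketch is a first check rather than a diagnosis.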