hadoop-common-user mailing list archives

From Mapred Learn <mapred.le...@gmail.com>
Subject Re: Error while using distcp
Date Mon, 02 May 2011 20:41:25 GMT
Hi,
Following the same email chain, what ports need to be open for distcp
to work between two Hadoop clusters?
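(Editor's note: as a rough aid to the question above, the snippet below probes the two endpoints that appear later in this thread. The hosts and ports are taken from the quoted distcp command and are otherwise assumptions; in a stock 0.20-era deployment, hftp:// reads also touch each DataNode's HTTP port (50075 by default) and hdfs:// writes also touch the DataNode data port (50010 by default), so those would need checking too. This is only a reachability sketch, not an authoritative port list.)

```shell
#!/bin/sh
# Probe the NameNode endpoints used in the distcp command in this thread:
#   hftp source  -> NameNode web/HTTP port (50070 here)
#   hdfs dest    -> NameNode RPC port (9000 here)
# Hostnames are the ones from the quoted command; substitute your own.
# Requires nc (netcat); -z just checks the port, -w 5 is a 5s timeout.
for hostport in c17-dw-dev50-hdfs-dn-n1:50070 \
                phx1-rb-dev40-pipe1.cnet.com:9000; do
  host=${hostport%:*}   # strip the :port suffix
  port=${hostport#*:}   # strip the host: prefix
  if nc -z -w 5 "$host" "$port"; then
    echo "open:   $hostport"
  else
    echo "closed: $hostport"
  fi
done
```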


Thanks,



On Mon, Apr 18, 2011 at 6:02 PM, sonia gehlot <sonia.gehlot@gmail.com> wrote:

> Sorry guys, it was a typo; it works now.
>
> Thanks,
> Sonia
>
> On Mon, Apr 18, 2011 at 5:45 PM, sonia gehlot <sonia.gehlot@gmail.com> wrote:
>
> > Yes same versions of hadoop on both the clusters.
> >
> >
> >
> >
> > On Mon, Apr 18, 2011 at 5:42 PM, James Seigel <james@tynt.com> wrote:
> >
> >> Same versions of hadoop in each cluster?
> >>
> >> Sent from my mobile. Please excuse the typos.
> >>
> >> On 2011-04-18, at 6:31 PM, sonia gehlot <sonia.gehlot@gmail.com> wrote:
> >>
> >> > Hi All,
> >> >
> >> > I am trying to copy files from one hadoop cluster to another hadoop
> >> cluster
> >> > but I am getting following error:
> >> >
> >> > [phx1-rb-bi-dev50-metrics-qry1:]$ scripts/hadoop.sh distcp \
> >> >     hftp://c17-dw-dev50-hdfs-dn-n1:50070/user/sgehlot/fact_lead.v0.txt.gz \
> >> >     hdfs://phx1-rb-dev40-pipe1.cnet.com:9000/user/sgehlot
> >> > HADOOP_HOME: /home/sgehlot/cnwk-hadoop/hadoop/0.20.1
> >> > HADOOP_CONF_DIR: /home/sgehlot/cnwk-hadoop/config/hadoop/0.20.1/conf_rb-dev
> >> > 11/04/18 17:12:23 INFO tools.DistCp: srcPaths=[hftp://c17-dw-dev50-hdfs-dn-n1:50070/user/sgehlot/fact_lead.v0.txt.gz]
> >> > 11/04/18 17:12:23 INFO tools.DistCp: destPath=hdfs://phx1-rb-dev40-pipe1.cnet.com:9000/user/sgehlot
> >> > [Fatal Error] :1:186: XML document structures must start and end within the same entity.
> >> > With failures, global counters are inaccurate; consider running with -i
> >> > Copy failed: java.io.IOException: invalid xml directory content
> >> >         at org.apache.hadoop.hdfs.HftpFileSystem$LsParser.fetchList(HftpFileSystem.java:239)
> >> >         at org.apache.hadoop.hdfs.HftpFileSystem$LsParser.getFileStatus(HftpFileSystem.java:244)
> >> >         at org.apache.hadoop.hdfs.HftpFileSystem.getFileStatus(HftpFileSystem.java:273)
> >> >         at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:689)
> >> >         at org.apache.hadoop.tools.DistCp.checkSrcPath(DistCp.java:621)
> >> >         at org.apache.hadoop.tools.DistCp.copy(DistCp.java:638)
> >> >         at org.apache.hadoop.tools.DistCp.run(DistCp.java:857)
> >> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >> >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> >> >         at org.apache.hadoop.tools.DistCp.main(DistCp.java:884)
> >> > Caused by: org.xml.sax.SAXParseException: XML document structures must start and end within the same entity.
> >> >         at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1231)
> >> >         at org.apache.hadoop.hdfs.HftpFileSystem$LsParser.fetchList(HftpFileSystem.java:233)
> >> >         ... 9 more
> >> >
> >> > Any idea why I am getting this?
> >> >
> >> > Thanks,
> >> > Sonia
> >>
> >
> >
>
