hadoop-mapreduce-user mailing list archives

From Hao Ren <h....@claravista.fr>
Subject Re: copy files from ftp to hdfs in parallel, distcp failed
Date Mon, 15 Jul 2013 09:52:10 GMT
Thank you, Ram

I have configured core-site.xml as follows:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>fs.ftp.host</name>
    <value><ftp server ip></value>
  </property>
  <property>
    <name>fs.ftp.host.port</name>
    <value>21</value>
  </property>
</configuration>
Then I tried  hadoop fs -ls file:/// , and it works.
But hadoop fs -ls ftp://<login>:<password>@<ftp server ip>/<directory>/

still doesn't work:
     ls: Cannot access ftp://<user>:<password>@<ftp server ip>/<directory>/: No such file or directory.

When omitting <directory>, as in:

hadoop fs -ls ftp://<login>:<password>@<ftp server ip>/

there are no error messages, but nothing is listed.

I have also checked the permissions on my /home/<user> directory:

drwxr-xr-x 11 <user> <user>  4096 Jul 11 16:30 <user>

and all the files under /home/<user> have mode 755.

If I paste the link ftp://<user>:<password>@<ftp server ip>/<directory>/ into Firefox, it lists all the files as expected.

Is there any workaround here?

Thank you.
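One thing worth ruling out before digging into the Hadoop side: if the FTP password (or login) contains URI-reserved characters such as @ or :, the ftp:// URI that hadoop fs receives parses incorrectly, and the failure can surface as exactly this kind of "No such file or directory". A minimal Python sketch of the check -- the user, password, host and directory below are hypothetical placeholders, not values from this thread:

```python
from urllib.parse import quote

# Hypothetical credentials -- substitute your own. A password containing
# URI-reserved characters (@, :, /, ...) must be percent-encoded, otherwise
# the ftp:// URI is ambiguous and parsers split it at the wrong '@' or ':'.
user = "myuser"
password = "p@ss:word"        # example with reserved characters
host = "ftp.example.com"      # placeholder FTP server
directory = "data/in"         # placeholder directory

# Percent-encode the userinfo part so the URI stays unambiguous.
uri = "ftp://{}:{}@{}/{}/".format(
    quote(user, safe=""),
    quote(password, safe=""),
    host,
    directory,
)
print(uri)  # ftp://myuser:p%40ss%3Aword@ftp.example.com/data/in/
```

If the encoded URI works where the raw one did not, the problem was the credentials, not the filesystem configuration.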

On 12/07/2013 14:01, Ram wrote:
> Please configure the following in core-site.xml and try:
>    hadoop fs -ls file:///                -- to list local file system files
>    hadoop fs -ls ftp://<your ftp location>   -- to list ftp files
> If it lists the files, go for distcp.
> Reference:
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/core-default.xml
>    fs.ftp.host           FTP filesystem connects to this server
>    fs.ftp.host.port  21  FTP filesystem connects to fs.ftp.host on this port

Hao Ren
