hadoop-hdfs-user mailing list archives

From "卖报的小行家" <85469...@qq.com>
Subject Re:S3N copy creating recursive folders
Date Tue, 05 Mar 2013 14:32:00 GMT
Hi Subroto,


I haven't used the s3n filesystem, but from the output "cp: java.io.IOException: mkdirs: Pathname
too long.  Limit 8000 characters, 1000 levels.", I think the problem is the path itself. Is the
path longer than 8000 characters, or does it go deeper than 1000 levels?
You listed only 998 folders, so maybe the last path exceeds 8000 characters. Why not count the
last one's length?
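The length and depth of the deepest path can be checked with a quick shell snippet (a sketch; the path below is an illustrative stand-in for the deepest entry printed by `hadoop fs -ls -R /test/srcData`):

```shell
# Stand-in for the deepest path from the recursive listing.
path="/test/srcData/srcData/srcData"

# Total character count of the path (limit is 8000).
echo "characters: ${#path}"

# Directory depth, counted as the number of '/' separators (limit is 1000).
depth=$(tr -cd '/' <<< "$path" | wc -c)
echo "levels: $depth"
```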


BRs//Julian

------------------ Original ------------------
From:  "Subroto"<ssanyal@datameer.com>;
Date:  Tue, Mar 5, 2013 10:22 PM
To:  "user"<user@hadoop.apache.org>; 

Subject:  S3N copy creating recursive folders



Hi,

I am using Hadoop 1.0.3 and trying to execute:
hadoop fs -cp s3n://accessKey:accessSecret@bucket.name/srcData /test/srcData

This ends up with:
cp: java.io.IOException: mkdirs: Pathname too long.  Limit 8000 characters, 1000 levels.

When I list /test/srcData recursively, it shows 998 nested folders like:
drwxr-xr-x   - root supergroup          0 2013-03-05 08:49 /test/srcData/srcData
drwxr-xr-x   - root supergroup          0 2013-03-05 08:49 /test/srcData/srcData/srcData
drwxr-xr-x   - root supergroup          0 2013-03-05 08:49 /test/srcData/srcData/srcData/srcData
drwxr-xr-x   - root supergroup          0 2013-03-05 08:49 /test/srcData/srcData/srcData/srcData/srcData
drwxr-xr-x   - root supergroup          0 2013-03-05 08:49 /test/srcData/srcData/srcData/srcData/srcData/srcData

Is there a problem with the s3n filesystem?

Cheers,
Subroto Sanyal