hadoop-user mailing list archives

From Robert Rapplean <robert.rappl...@trueffect.com>
Subject RE: hftp can list directories but won't send files
Date Wed, 19 Dec 2012 16:38:23 GMT
Thanks, Arpit. Didn't even know I was on Cloudera. I'll go bug them.

Robert Rapplean
Senior Software Engineer
303-872-2256  direct  | 303.438.9597  main | www.trueffect.com

From: Arpit Gupta [mailto:arpit@hortonworks.com]
Sent: Tuesday, December 18, 2012 7:01 PM
To: user@hadoop.apache.org
Subject: Re: hftp can list directories but won't send files

Robert

Another thing you can try:

export HADOOP_ROOT_LOGGER=DEBUG,console

Then run the hadoop dfs -cat command with the hftp URL, and you should get more logs on the client.
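For example (the host and path here are placeholders, not your real ones):

export HADOOP_ROOT_LOGGER=DEBUG,console
hadoop fs -cat hftp://namenode.example.com:50070/user/username/somefile

The extra DEBUG output should make it easier to see which HTTP request is failing and what the client gets back.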

Also, since you are running CDH, it might be better to ask on the CDH mailing lists.

--
Arpit Gupta
Hortonworks Inc.
http://hortonworks.com/

On Dec 18, 2012, at 3:28 PM, Robert Rapplean <robert.rapplean@trueffect.com>
wrote:


The cluster says this:

Hadoop 2.0.0-cdh4.0.0
Subversion file:///data/1/jenkins/workspace/generic-package-rhel64-6-0/topdir/BUILD/hadoop-2.0.0-cdh4.0.0/src/hadoop-common-project/hadoop-common
-r 5d678f6bb1f2bc49e2287dd69ac41d7232fc9cdc
Compiled by jenkins on Mon Jun  4 16:52:21 PDT 2012
From source with checksum 64f877fc49f5adc0d7d55c13089e866e

Retrieving that put a tiny strain on my knowledge. Can you suggest which logs you'd like me to
look at?

Robert Rapplean
Senior Software Engineer
303-872-2256  direct  | 303.438.9597  main | www.trueffect.com


-----Original Message-----
From: Harsh J [mailto:harsh@cloudera.com]
Sent: Tuesday, December 18, 2012 4:17 PM
To: <user@hadoop.apache.org>
Subject: Re: hftp can list directories but won't send files

What version/distribution of Hadoop is your source cluster?

Also, I'd take a look at your NN's and a few of your DNs' logs right after encountering
this issue, to see the reason and stack trace printed for the Server Error 500 (a code for a server-end
fault). That'd give us more ideas on the whys.
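If you're not sure where those live, on a typical packaged install the HDFS daemon logs land under /var/log/hadoop-hdfs (that path is a guess based on common CDH layouts; adjust for your install), and something like this will pull out recent stack traces on the NN:

grep -B 2 -A 20 Exception /var/log/hadoop-hdfs/*namenode*.log | tail -60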

On Wed, Dec 19, 2012 at 4:13 AM, Robert Rapplean <robert.rapplean@trueffect.com>
wrote:

Hey, everyone. I just finished reading all of the unsubscribe messages from Sept-Oct,
and I'm hoping someone has a clue about what my system is doing wrong. I suspect this
is a configuration issue, but I don't even know where to start looking for it. I'm a developer,
and my sysadmin is tied up until the end of the year.

I'm trying to move files from one cluster to another with distcp, using the hftp protocol
as specified in the distcp instructions.
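In case my syntax is the problem, the command I'm running looks roughly like this (hostnames, ports, and paths here are placeholders, not my real ones):

hadoop distcp hftp://source-nn.example.com:50070/user/username/logfiles hdfs://dest-nn.example.com:8020/user/username/logfiles

I'm running it on the destination cluster, as the docs say to do when reading over hftp.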

I can read directories over hftp, but when I attempt to get a file I get a 500 (internal server
error). To eliminate the possibility of network and firewall issues, I'm running hadoop fs -ls
and hadoop fs -cat commands on the source server to try to narrow down the problem.

This provides a listing of the files, which is correct:

hadoop fs -ls ourlogs/day_id=19991231/hour_id=1999123123
-rw-r--r--   3 username supergroup        812 2012-12-16 17:21 logfiles/day_id=19991231/hour_id=1999123123/000008_0

This gives me a "file not found" error, which is also correct because the file isn't there:

hadoop fs -cat hftp://hdenn00.trueffect.com:50070/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0x
cat: `hftp://hdenn00.trueffect.com:50070/user/prodman/ods_fail/day_id=19991231/hour_id=1999123123/000008_0x': No such file or directory

This line gives me a 500 internal server error. The file is confirmed to be on the server.

hadoop fs -cat hftp://hdenn00.trueffect.com:50070/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0
cat: HTTP_OK expected, received 500

Here is a stack trace of what distcp logs when I attempt this:

java.io.IOException: HTTP_OK expected, received 500
   at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderUrlOpener.connect(HftpFileSystem.java:365)
   at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:119)
   at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
   at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:187)
   at java.io.DataInputStream.read(DataInputStream.java:83)
   at org.apache.hadoop.tools.DistCp$CopyFilesMapper.copy(DistCp.java:424)
   at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:547)
   at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:314)
   at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
   at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
   at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
   at java.security.AccessController.doPrivileged(Native Method)
   at javax.security.auth.Subject.doAs(Subject.java:396)
   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
   at org.apache.hadoop.mapred.Child.main(Child.java:262)
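For what it's worth, would it be valid to hit the hftp data path directly with curl to see the raw 500 response body? My understanding is that hftp serves file contents through the NameNode's /data servlet with a ugi query parameter, so something like this (treat the URL shape as my best guess at the hftp internals):

curl -i "http://hdenn00.trueffect.com:50070/data/user/username/logfiles/day_id=19991231/hour_id=1999123123/000008_0?ugi=username"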

Can someone tell me why hftp is failing to serve files, or at least where to look?



--
Harsh J


