hadoop-mapreduce-user mailing list archives

From Björn-Elmar Macek <ma...@cs.uni-kassel.de>
Subject HDFS "file" missing a part-file
Date Mon, 01 Oct 2012 16:11:28 GMT

I am not quite sure where to post this problem, but I think it is more
related to Hadoop than to Pig.

By successfully executing a Pig script I created a new file in my HDFS.
Sadly, I cannot use it for further processing beyond DUMPing and
viewing the data: every data-manipulating script command, such as
FOREACH, throws exceptions during the map phase.
Since there was no problem executing the same script on the first 100
lines of my data (via a LIMIT statement), I copied the output to my local
filesystem. What I noticed is that one of the files, namely part-r-000001,
was empty and still contained within the _temporary folder.
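For what it is worth, a leftover _temporary directory usually suggests that the output committer never promoted the task's output, i.e. the job did not finish cleanly. A quick way to check the output directory might look like the following sketch (the path /user/elmar/output is a hypothetical placeholder; on older Hadoop 1.x installs the recursive listing is `hadoop fs -lsr` rather than `-ls -R`):

```shell
# Recursively list the job output directory; a cleanly completed job has
# all part-* files at the top level and no _temporary directory left over.
hadoop fs -ls -R /user/elmar/output

# Check for the _SUCCESS marker that is written on clean job completion:
hadoop fs -test -e /user/elmar/output/_SUCCESS && echo "job completed" \
  || echo "no _SUCCESS marker - job likely did not finish cleanly"

# Spot empty part files: the size is the 5th field of the -ls output,
# the path is the 8th.
hadoop fs -ls /user/elmar/output | awk '$5 == 0 {print $8}'
```

If _temporary is still present, re-running the job (or at least the failed reducer) is generally safer than copying the stray part file out by hand.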

Is there any reason for this? How can I fix this issue? Did the job
(which created the file we are talking about) NOT run properly to its
end, even though the TaskTracker appeared to work until the very end and
the file was created?

Best regards,
