It seems that the previous Pig script may not have generated its output data or written it to HDFS correctly. Can you provide the Pig script you are trying to run? Also, for the original script that ran and generated the file, can you verify whether that job had any failed tasks?
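The JobTracker web UI lists failed and killed task attempts per job; from the command line, something like the following should work too (the job id and output path here are hypothetical):

        # print completion status and failure info for the job
        hadoop job -status job_201210010000_0042

        # or summarize task failures from the history stored alongside the output
        hadoop job -history /user/ema/output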

On Mon, Oct 1, 2012 at 10:31 AM, Björn-Elmar Macek <> wrote:

Hi Robert,

the exception I see in the output of the grunt shell and in the Pig log, respectively, is:

Backend error message
        at java.util.Stack.peek(…)
        at org.apache.pig.builtin.Utf8StorageConverter.consumeTuple(…)
        at org.apache.pig.builtin.Utf8StorageConverter.bytesToTuple(…)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POCast.getNext(…)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.getNext(…)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.processPlan(…)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNext(…)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(…)
        at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNext(…)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(…)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(…)
        at org.apache.hadoop.mapred.Child$…
        at …(Native Method)
        at org.apache.hadoop.mapred.Child.main(…)

On Mon, 1 Oct 2012 10:12:22 -0700, Robert Molina <> wrote:
Hi Björn,
Can you post the exception you are getting during the map phase?
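If the grunt output truncates it, the full stack trace should also be in the logs of the failed task attempt, reachable from the JobTracker web UI or directly on the tasktracker node, roughly like this (the log directory and attempt id are hypothetical):

        # task attempt logs live under the tasktracker's userlogs directory
        less /var/log/hadoop/userlogs/job_201210010000_0042/attempt_201210010000_0042_m_000000_0/syslog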

On Mon, Oct 1, 2012 at 9:11 AM, Björn-Elmar Macek  wrote:


 I am somewhat unsure where to post this problem, but I think it is
more related to Hadoop than to Pig.

 By successfully executing a Pig script I created a new file in my
HDFS. Sadly, though, I cannot use it for further processing except for
DUMPing and viewing the data: every data-manipulating command such as
FOREACH throws exceptions during the map phase; a minimal sketch of the
failing pattern follows below.
 Since there was no problem executing the same script on the first 100
lines of my data (via a LIMIT statement), I copied the file to my local
fs folder to inspect it.
 What I realized is that one of the files, namely part-r-000001, was
empty and located inside the _temporary folder; the commands I used to
check this are sketched below.
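Roughly what I ran (the output path is hypothetical):

        # recursively list the job output, including anything left in _temporary
        hadoop fs -lsr /user/ema/output

        # copy the whole directory to the local fs for inspection
        hadoop fs -copyToLocal /user/ema/output /tmp/output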

 Is there any reason for this? How can I fix this issue? Did the job
(which created the file we are talking about) NOT run properly to its
end, although the tasktracker worked until the very end and the file was written?

 Best regards,