hadoop-hive-dev mailing list archives

From "Johan Oskarsson (JIRA)" <j...@apache.org>
Subject [jira] Created: (HIVE-217) Stream closed exception
Date Wed, 07 Jan 2009 19:06:44 GMT
Stream closed exception

                 Key: HIVE-217
                 URL: https://issues.apache.org/jira/browse/HIVE-217
             Project: Hadoop Hive
          Issue Type: Bug
          Components: Serializers/Deserializers
         Environment: Hive from trunk, hadoop 0.18.2, ~20 machines
            Reporter: Johan Oskarsson
            Priority: Critical

When running a query similar to the following, where one table is ~40 GB and the others are a couple of hundred MB, the error happens in the first map-reduce job, the one that processes the 40 GB table:

insert overwrite table outputtable
select a, b, cast(sum(counter) as INT)
from tablea
join tableb on (tablea.username = tableb.username)
join tablec on (tablec.userid = tablea.userid)
join tabled on (tablec.id = tabled.id)
where insertdate >= 'somedate' and insertdate <= 'someotherdate'
group by a, b;

I get the following exception (see attached file for full stack trace):
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: Stream closed.
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:162)

It happens in one reduce task and is reproducible; running the same query again produces the same error.
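
For illustration only (this is a standalone sketch, not Hive code), the underlying failure is the generic one where something keeps writing to an output stream that has already been closed, for example through an earlier error path or a duplicate close():

    // Standalone sketch, not taken from Hive: writing to a writer after it
    // has been closed raises java.io.IOException: Stream closed, the same
    // low-level message reported in the stack trace above.
    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.IOException;

    public class StreamClosedDemo {
        public static void main(String[] args) throws IOException {
            BufferedWriter out = new BufferedWriter(new FileWriter("/tmp/demo.out"));
            out.write("first row\n");
            out.close();               // stream closed, e.g. by an error path or a duplicate close()
            out.write("second row\n"); // throws java.io.IOException: Stream closed
        }
    }

That would be consistent with FileSinkOperator.process attempting a write after its output stream was closed earlier in the task, though the attached full stack trace is needed to confirm where the close actually happens.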

