hive-dev mailing list archives

From "Szehon Ho (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-7220) Empty dir in external table causes issue (root_dir_external_table.q failure)
Date Fri, 13 Jun 2014 04:15:02 GMT

    [ https://issues.apache.org/jira/browse/HIVE-7220?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14030248#comment-14030248 ]

Szehon Ho commented on HIVE-7220:
---------------------------------

Yea, but the test did catch a real issue (a folder in any external table directory causes an
error). It's probably best if Hadoop reverts to the old behavior (in 2.5?), but either way this
patch fixes it on the Hive side, as another option.

> Empty dir in external table causes issue (root_dir_external_table.q failure)
> ----------------------------------------------------------------------------
>
>                 Key: HIVE-7220
>                 URL: https://issues.apache.org/jira/browse/HIVE-7220
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Szehon Ho
>            Assignee: Szehon Ho
>         Attachments: HIVE-7220.patch
>
>
> While looking at the root_dir_external_table.q failure, which runs a query on an external
> table located at root ('/'), I noticed that the latest Hadoop 2 CombineFileInputFormat returns
> splits representing empty directories (like '/Users'), which leads to a failure in Hive's
> CombineFileRecordReader as it tries to open the directory for processing.
> I tried with an external table in a normal HDFS directory, and it returns the same error.
> Looks like a real bug.
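
For illustration only (this is not the attached HIVE-7220.patch), here is a minimal Java sketch
of the kind of Hive-side guard described above: check each path that a combine split would hand
to CombineFileRecordReader via FileSystem.getFileStatus() and skip anything that is a directory.
The class and method names (SplitPathFilter, keepFilesOnly) are hypothetical.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SplitPathFilter {
  /**
   * Returns only the paths that point at regular files, dropping
   * directories such as '/Users' that a split over an external table
   * rooted at '/' might otherwise include.
   */
  public static List<Path> keepFilesOnly(FileSystem fs, List<Path> candidates)
      throws IOException {
    List<Path> files = new ArrayList<Path>();
    for (Path p : candidates) {
      FileStatus status = fs.getFileStatus(p);
      if (!status.isDirectory()) {
        files.add(p);
      }
    }
    return files;
  }
}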



