hadoop-common-dev mailing list archives

From "Hadoop QA (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-5887) Sqoop should create tables in Hive metastore after importing to HDFS
Date Fri, 05 Jun 2009 22:12:07 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-5887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12716771#action_12716771 ]

Hadoop QA commented on HADOOP-5887:

-1 overall.  Here are the results of testing the latest attachment 
  against trunk revision 782083.

    +1 @author.  The patch does not contain any @author tags.

    +1 tests included.  The patch appears to include 19 new or modified tests.

    +1 javadoc.  The javadoc tool did not generate any warning messages.

    +1 javac.  The applied patch does not increase the total number of javac compiler warnings.

    +1 findbugs.  The patch does not introduce any new Findbugs warnings.

    +1 Eclipse classpath. The patch retains Eclipse classpath integrity.

    -1 release audit.  The applied patch generated 496 release audit warnings (more than the
trunk's current 492 warnings).

    -1 core tests.  The patch failed core unit tests.

    -1 contrib tests.  The patch failed contrib unit tests.

Test results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/469/testReport/
Release audit warnings: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/469/artifact/trunk/patchprocess/releaseAuditDiffWarnings.txt
Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/469/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
Checkstyle results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/469/artifact/trunk/build/test/checkstyle-errors.html
Console output: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/469/console

This message is automatically generated.

> Sqoop should create tables in Hive metastore after importing to HDFS
> --------------------------------------------------------------------
>                 Key: HADOOP-5887
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5887
>             Project: Hadoop Core
>          Issue Type: New Feature
>            Reporter: Aaron Kimball
>            Assignee: Aaron Kimball
>         Attachments: HADOOP-5887.2.patch, HADOOP-5887.patch
> Sqoop (HADOOP-5815) imports tables into HDFS; it is a straightforward enhancement to
> then generate a Hive DDL statement to recreate the table definition in the Hive metastore
> and move the imported table into the Hive warehouse directory from its upload target.
> This feature enhancement makes this process automatic. An import is performed with sqoop
> in the usual way; providing the argument "--hive-import" will cause it to then issue a CREATE
> TABLE .. LOAD DATA INTO statement to a Hive shell. It generates a script file and then attempts
> to run "$HIVE_HOME/bin/hive" on it, or failing that, any "hive" on the $PATH; $HIVE_HOME can
> be overridden with --hive-home. As a result, no direct linking against Hive is necessary.
> The unit tests provided with this enhancement use a mock implementation of 'bin/hive'
> that compares the script it's fed with one from a directory full of "expected" scripts. The
> exact script file referenced is controlled via an environment variable. It doesn't actually
> load into a proper Hive metastore, but manual testing has shown that this process works in
> practice, so the mock implementation is a reasonable unit testing tool.
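
For illustration, a minimal Java sketch of the flow the description above outlines: write the
generated Hive statements to a script file and hand it to the hive CLI, preferring
$HIVE_HOME/bin/hive (or a --hive-home override) and falling back to "hive" on the $PATH. This is
not the patch's actual code; the class name, method names, and sample table here are hypothetical,
and only the overall behavior is taken from the description.

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class HiveImportSketch {

  // Locate the hive executable: $HIVE_HOME/bin/hive if set (or overridden
  // via --hive-home), otherwise fall back to "hive" on the $PATH.
  static String hiveBinary(String hiveHomeOverride) {
    String hiveHome = (hiveHomeOverride != null)
        ? hiveHomeOverride : System.getenv("HIVE_HOME");
    if (hiveHome != null) {
      File bin = new File(new File(hiveHome, "bin"), "hive");
      if (bin.exists()) {
        return bin.getAbsolutePath();
      }
    }
    return "hive"; // rely on $PATH
  }

  // Write the generated statements to a script file and run "hive -f <script>".
  static void runHiveScript(String statements, String hiveHomeOverride)
      throws IOException, InterruptedException {
    File script = File.createTempFile("hive-import", ".q");
    try (FileWriter w = new FileWriter(script)) {
      w.write(statements);
    }
    Process p = new ProcessBuilder(hiveBinary(hiveHomeOverride), "-f",
        script.getAbsolutePath()).inheritIO().start();
    if (p.waitFor() != 0) {
      throw new IOException("hive exited with status " + p.exitValue());
    }
  }

  public static void main(String[] args) throws Exception {
    // Illustrative statements only; the real generator would derive the
    // column list from the imported table's schema.
    String stmts =
        "CREATE TABLE employees (id INT, name STRING)\n"
      + "  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\001';\n"
      + "LOAD DATA INPATH 'hdfs:///user/aaron/employees' INTO TABLE employees;\n";
    runHiveScript(stmts, null);
  }
}

Under this reading, the mock 'bin/hive' used by the unit tests only needs to compare the script
file it receives against an expected script, which keeps the tests independent of a running Hive
metastore.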

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
