hadoop-common-dev mailing list archives

From "Prasad Chakka (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-5887) Sqoop should create tables in Hive metastore after importing to HDFS
Date Fri, 22 May 2009 00:23:45 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-5887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12711877#action_12711877 ]

Prasad Chakka commented on HADOOP-5887:
---------------------------------------

Use external tables if the data is not movable or if the data has to reside in a non-default
file system (NFS-mounted, a different HDFS, S3, etc.). In any other case, use internal tables.
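The distinction above can be sketched in Hive DDL. This is only an illustration; the table names, columns, and the S3 path are hypothetical:

```sql
-- Internal (managed) table: Hive owns the data, and DROP TABLE
-- deletes the underlying files along with the metadata.
CREATE TABLE employees (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- External table: Hive only records the location in the metastore,
-- and DROP TABLE leaves the files in place. Suited to data that must
-- stay on S3, an NFS mount, or a different HDFS.
CREATE EXTERNAL TABLE employees_ext (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3n://mybucket/employees/';
```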

> Sqoop should create tables in Hive metastore after importing to HDFS
> --------------------------------------------------------------------
>
>                 Key: HADOOP-5887
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5887
>             Project: Hadoop Core
>          Issue Type: New Feature
>            Reporter: Aaron Kimball
>            Assignee: Aaron Kimball
>         Attachments: HADOOP-5887.patch
>
>
> Sqoop (HADOOP-5815) imports tables into HDFS; it is a straightforward enhancement to
> then generate a Hive DDL statement to recreate the table definition in the Hive metastore
> and move the imported table into the Hive warehouse directory from its upload target.
> This feature enhancement makes this process automatic. An import is performed with sqoop
> in the usual way; providing the argument "--hive-import" will cause it to then issue a CREATE
> TABLE .. LOAD DATA INTO statement to a Hive shell. It generates a script file and then attempts
> to run "$HIVE_HOME/bin/hive" on it, or failing that, any "hive" on the $PATH; $HIVE_HOME can
> be overridden with --hive-home. As a result, no direct linking against Hive is necessary.
> The unit tests provided with this enhancement use a mock implementation of 'bin/hive'
> that compares the script it is fed with one from a directory full of "expected" scripts. The
> exact script file referenced is controlled via an environment variable. It does not actually
> load into a proper Hive metastore, but manual testing has shown that this process works in
> practice, so the mock implementation is a reasonable unit testing tool.
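The CREATE TABLE / LOAD DATA sequence the description mentions might look roughly like the sketch below. The table name, columns, and HDFS path are hypothetical, and the exact delimiters would depend on the import options:

```sql
-- A script of the kind Sqoop could emit for an imported table: first
-- recreate the table definition in the metastore, then move the
-- already-imported files from the upload target into the warehouse.
CREATE TABLE employees (id INT, name STRING, salary DOUBLE)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

LOAD DATA INPATH 'hdfs://namenode/user/aaron/employees'
INTO TABLE employees;
```

Because the script is fed to an external "bin/hive" process, any Hive version whose shell accepts this DDL works, which is why no direct linking against Hive is needed.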

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

