hive-user mailing list archives

From Szymon Gwóźdź <>
Subject exporting partitioned data into remote database
Date Fri, 18 Jun 2010 12:52:29 GMT

I have a table tb1 defined by:
CREATE TABLE tb1(user int, counter int) PARTITIONED BY (day string) 

I want to export data from this table into a mysql table defined by:
CREATE TABLE tb2(user int, counter int, day string)

I've tried to use Sqoop for this, but Sqoop doesn't allow exporting a 
directory with partitions. When I try:
sqoop --connect jdbc:mysql://test-db.gadu/crunchers --username crunchers 
--password RydBert3 --table tb2 --export-dir /user/hive/warehouse/tb1
I get:
10/06/18 13:55:03 WARN mapreduce.ExportJob: IOException checking 
SequenceFile header: Cannot open filename 
It is possible to do something like this:
sqoop --connect jdbc:mysql://test-db.gadu/crunchers --username crunchers 
--password RydBert3 --table tb3 --export-dir 
using a mysql table defined by:
CREATE TABLE tb3(user int, counter int),
but that is not what I want, because I want to have the "day" column 
in the mysql table.
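One idea I could imagine (just a sketch, and the table name tb1_flat is 
made up) is to first flatten the partition column into a regular column of 
a non-partitioned staging table, since Hive lets you SELECT a partition 
column like any other column:

```sql
-- Hypothetical staging table: same columns as tb1, plus the partition
-- column "day" materialized as an ordinary column.
CREATE TABLE tb1_flat (user INT, counter INT, day STRING);

-- Copy all partitions of tb1 into the flat layout; the partition
-- column "day" is selectable in the SELECT list.
INSERT OVERWRITE TABLE tb1_flat
SELECT user, counter, day FROM tb1;
```

Then the staging table's warehouse directory has no partition 
subdirectories, so it could presumably be exported with a command of the 
same shape as above, e.g. sqoop --connect ... --table tb2 --export-dir 
/user/hive/warehouse/tb1_flat — but I haven't verified this.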

Does anyone know how to load this data into mysql?

Szymon Gwóźdź
