hive-user mailing list archives

From ameet chaubal <>
Subject Re: large sql file creating large num of columns
Date Mon, 16 Jan 2012 15:14:10 GMT

This is an external table, so no data loading happens at the DDL stage.
All Hive is supposed to do is load the definition into MySQL, right? Are you suggesting
that it's reading the data file in HDFS? That should not be happening, since an external table
does not need the data to be present, right?
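For context, a minimal external-table DDL looks like the sketch below (the table name and HDFS path are made up for illustration). The point being argued above is that CREATE EXTERNAL TABLE only records metadata and a LOCATION in the metastore; it is not expected to scan the files under that path at DDL time.

```shell
# Write a minimal external-table DDL to a file (names/paths are illustrative).
cat > /tmp/ext_demo.sql <<'EOF'
CREATE EXTERNAL TABLE ext_demo (id INT, val STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/demo/ext_demo';
EOF

# Running it would only touch the metastore; commented out here since it
# requires a working Hive installation:
#   hive -f /tmp/ext_demo.sql
cat /tmp/ext_demo.sql
```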



From: Edward Capriolo <>
To: ameet chaubal <>
Sent: Monday, January 16, 2012 10:06 AM
Subject: Re: large sql file creating large num of columns

I highly doubt this will work. I think that many things in Hadoop and Hive will try to buffer
an entire row, so even if you make it past the metastore I do not think it will be of any use.

On Mon, Jan 16, 2012 at 9:42 AM, ameet chaubal <> wrote:

> Hi All,
> I have a SQL file of size 30 MB which is a single CREATE TABLE statement with about 800,000
> columns, hence the size.
> I am trying to execute it using hive -f <file>. Initially, hive ran the command
> with a 256 MB heap size and gave me an OOM error. I increased the heap size using export HADOOP_HEAPSIZE
> to 1 GB and eventually 2 GB, which made the OOM error go away. However, the hive command ran
> for 5 hours without actually creating the table. The JVM was still running.
> 1. Running a strace on the process showed that it was stuck on a futex call.
> 2. I am using MySQL for the metastore, and no rows were added to either TBLS or COLUMNS.
> My questions:
> 1. Can Hive do this create table of 800k columns from a SQL file of 30 MB?
> 2. If it is theoretically possible, what could be happening that's taking it over 5 hours and
> still not succeeding?
> Any insight is much appreciated.
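As a rough sanity check (not from the thread itself), a script like the one below generates a CREATE TABLE statement with 800,000 short synthetic column names. It comes out around 14 MB, the same order of magnitude as the 30 MB file described above, so that file size is plausible once realistic column names are used.

```shell
# Generate a CREATE TABLE statement with 800,000 STRING columns
# (synthetic names; real column names would make the file larger).
awk 'BEGIN {
  printf "CREATE TABLE wide_demo (\n";
  for (i = 1; i <= 800000; i++) printf "col%d STRING,\n", i;
  printf "last_col STRING\n);\n";
}' > /tmp/wide_demo.sql

# Byte count of the generated DDL (~14 MB with these short names)
wc -c < /tmp/wide_demo.sql
```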