hive-user mailing list archives

From yongqiang he <heyongqiang...@gmail.com>
Subject Re: Can compression be used with ColumnarSerDe ?
Date Mon, 24 Jan 2011 21:14:45 GMT
How did you upload the data to the new table?
You can get the data compressed by doing an INSERT OVERWRITE into the
destination table with "hive.exec.compress.output" set to true.
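For example, a minimal sketch of that approach (the table names "src" and
"dest_rcfile" and the Gzip codec choice are just placeholders; the codec
setting is optional, and Hive falls back to the cluster default if it is
not set):

  SET hive.exec.compress.output=true;
  SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;

  INSERT OVERWRITE TABLE dest_rcfile
  SELECT * FROM src;

With those settings the blocks written into the RCFile table are
compressed, which should bring its size closer to the gzip sequence file
source.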

Thanks
Yongqiang
On Mon, Jan 24, 2011 at 12:30 PM, Edward Capriolo <edlinuxguru@gmail.com> wrote:
> I am trying to explore some use cases that I believe are perfect for
> the ColumnarSerDe: tables with 100+ columns where only one or two are
> selected in a particular query.
>
> CREATE TABLE (....)
> ROW FORMAT SERDE "org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"
>   STORED AS RCFile ;
>
> My issue is that my data from our source table, stored as gzip
> sequence files, is much smaller than the ColumnarSerDe table, and as a
> result any performance gains are lost.
>
> Any ideas?
>
> Thank you,
> Edward
>
