incubator-hcatalog-user mailing list archives

From David Capwell <dcapw...@gmail.com>
Subject Pig creates files on HDFS but partitions not getting added
Date Fri, 06 Apr 2012 20:21:01 GMT
I am using trunk and it seems that Pig is not updating partitions for me.
Is anyone else seeing Pig scripts not update partitions?  The commands I ran
are below.

CREATE TABLE tmp_table (
   data string
)
partitioned by (
   datestamp string
  ,srcid string
  ,action string
  ,testid string
)
stored as rcfile
location '/tmp/hcat_tmp_tables/tmp_table';

The end of my Pig script looks like this:

store b into 'default.tmp_table' using
org.apache.hcatalog.pig.HCatStorer('datestamp=20091103,srcid=19174,action=click,testid=NOTESTID');
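
(A minimal version of the script would look roughly like the sketch below; the
load path and single-column schema are placeholders rather than the real ones,
and the HCatalog jars have to be on Pig's classpath for HCatStorer to resolve.)

-- placeholder load path and schema; the relation only needs to match the
-- table's single string column "data"
b = LOAD '/tmp/input_data' USING PigStorage() AS (data:chararray);

-- same store call as above: all four partition keys given as a static spec
STORE b INTO 'default.tmp_table' USING
  org.apache.hcatalog.pig.HCatStorer('datestamp=20091103,srcid=19174,action=click,testid=NOTESTID');

The non-partition schema of the stored relation has to line up with the table's
columns, which it does here (one chararray).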


On HDFS I see the following:

-rw-rw-r--   3 dcapwell hdfs   13990632 2012-04-06 17:32 /tmp/hcat_tmp_tables/tmp_table/action=click/datestamp=20091103/srcid=19174/testid=NOTESTID/part-m-00000


When I check the partitions in Hive I see the following:
./bin/hive -e "show partitions tmp_table"
OK
Time taken: 1.461 seconds

So it looks like the Pig job finishes just fine and writes the files to HDFS,
but the partitions are not getting updated?
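
(As a sanity check, the file that was written could presumably be exposed by
registering the partition by hand against the directory HCatStorer created,
e.g. with something like the untested statement below, although HCatStorer
should be doing this registration itself.)

ALTER TABLE tmp_table ADD PARTITION (datestamp='20091103', srcid='19174', action='click', testid='NOTESTID')
LOCATION '/tmp/hcat_tmp_tables/tmp_table/action=click/datestamp=20091103/srcid=19174/testid=NOTESTID';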


Thanks for your time reading this email.
