hive-user mailing list archives

From Tim Robertson <>
Subject Writing HFiles using Hive for an HBase bulk load - possible bug?
Date Tue, 19 Apr 2016 18:43:05 GMT
Hi folks,

I am trying to create HFiles from a Hive table to bulk load into HBase and
am following the HWX [1] tutorial.
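
For anyone who hasn't seen the tutorial, the approach boils down to roughly the following (a sketch from memory; the table, column, and path names are placeholders rather than my actual setup):

```sql
-- Tell the HBase storage handler to write HFiles instead of issuing Puts,
-- and point it at a staging directory for a single column family "f".
SET hive.hbase.generatehfiles=true;
SET hfile.family.path=/tmp/hbase_hfiles/f;

-- hbase_target is a Hive table backed by the HBaseStorageHandler.
-- HFiles need the row keys in totally sorted order, hence the ORDER BY.
INSERT OVERWRITE TABLE hbase_target
SELECT rowkey, col1
FROM source_table
ORDER BY rowkey;
```

After that, the staged HFiles would be moved into HBase with the completebulkload tool (org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles), but the failure below happens before I ever get to that step.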

It creates the HFiles correctly, but then fails when closing the
RecordWriter with the following stack trace:

Error: java.lang.RuntimeException: Hive Runtime Error while closing operators: Multiple family directories found in hdfs://
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(
    at org.apache.hadoop.mapred.YarnChild$
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.hadoop.mapred.YarnChild.main(

The reason appears to be that the HFiles are created in the task attempt
directory, but on close it looks for the HFile two directories above the
task attempt directory.

Is anyone else seeing this behaviour?

I have logged it as a bug [2], which details my exact procedure. Could
someone confirm that they also see this, or whether I am perhaps just
doing something wrong and it works for them?

Thanks all,


