hbase-user mailing list archives

From Guillermo Ortiz <konstt2...@gmail.com>
Subject Error loading SHA-1 keys with bulk load
Date Thu, 01 May 2014 12:38:34 GMT
I have been looking at the HBase code, but I don't really understand why this error happens. Why can't I put those keys into HBase?
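For context: the HFile writer only accepts KeyValues in strictly increasing KeyValue.COMPARATOR order, so a duplicated cell, or two cells of the same row reaching the writer in the wrong order, produces exactly the message quoted below. A minimal sketch of that ordering rule; the row key is copied from the exception, while the family, qualifiers and values are invented:

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.util.Bytes;

public class KeyOrderSketch {
    public static void main(String[] args) {
        byte[] row = Bytes.toBytes("6e9e59f36a7ec2ac54635b2d353e53e677839046");
        byte[] fam = Bytes.toBytes("f");  // placeholder family

        KeyValue q1 = new KeyValue(row, fam, Bytes.toBytes("q1"), Bytes.toBytes("v1"));
        KeyValue q2 = new KeyValue(row, fam, Bytes.toBytes("q2"), Bytes.toBytes("v2"));

        // HFileWriterV2 appends keys only in strictly increasing order, so writing
        // q2 before q1 (or q1 twice) fails with "not lexically larger than previous key".
        System.out.println(KeyValue.COMPARATOR.compare(q1, q2));  // < 0: q1 must be written first
    }
}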

2014-04-30 17:57 GMT+02:00 Guillermo Ortiz

> I'm using HBase with MapReduce to load a lot of data, so I have decided to
> do it with bulk load.
> I hash my keys with SHA-1, but when I try to load them, I get this
> exception.
> java.io.IOException: Added a key not lexically larger than previous key=\x00(6e9e59f36a7ec2ac54635b2d353e53e677839046\x01l\x00\x00\x01E\xB3>\xC9\xC7\x0E,
> 	at org.apache.hadoop.hbase.io.hfile.AbstractHFileWriter.checkKey(AbstractHFileWriter.java:207)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.append(HFileWriterV2.java:324)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.append(HFileWriterV2.java:289)
> 	at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.append(StoreFile.java:1206)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:168)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:124)
> 	at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:551)
> 	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
> I work with HBase 0.94.6. I have been looking into whether I should define a reducer,
> since I haven't defined one. I have read something about KeyValueSortReducer, but I don't
> know whether there is something that extends TableReducer, or whether I'm going about this
> the wrong way.
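For reference, on 0.94 the usual way to wire this up is to let HFileOutputFormat.configureIncrementalLoad() choose the reducer: it plugs in KeyValueSortReducer (or PutSortReducer for Put values) together with TotalOrderPartitioner, so no TableReducer subclass is needed, and the per-row sorting it does is what prevents the exception above. A minimal driver sketch; the mapper, column family, table name and paths are invented for illustration:

import java.io.IOException;

import org.apache.commons.codec.digest.DigestUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Sha1BulkLoadDriver {

    // Placeholder mapper: hashes the first tab-separated field with SHA-1 to
    // build the row key and emits one KeyValue per remaining field.
    public static class Sha1KeyMapper
            extends Mapper<LongWritable, Text, ImmutableBytesWritable, KeyValue> {

        private static final byte[] FAMILY = Bytes.toBytes("f");  // placeholder family

        @Override
        protected void map(LongWritable offset, Text line, Context ctx)
                throws IOException, InterruptedException {
            String[] fields = line.toString().split("\t");
            byte[] row = Bytes.toBytes(DigestUtils.shaHex(fields[0]));  // 40-char hex SHA-1
            ImmutableBytesWritable outKey = new ImmutableBytesWritable(row);
            for (int i = 1; i < fields.length; i++) {
                ctx.write(outKey, new KeyValue(row, FAMILY,
                        Bytes.toBytes("c" + i), Bytes.toBytes(fields[i])));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "sha1-bulk-load");
        job.setJarByClass(Sha1BulkLoadDriver.class);

        job.setMapperClass(Sha1KeyMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(KeyValue.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Configures TotalOrderPartitioner from the table's region boundaries and
        // sets KeyValueSortReducer, which sorts (and de-duplicates) the KeyValues
        // of each row before they reach the HFile writer.
        HTable table = new HTable(conf, "my_table");  // placeholder table name
        HFileOutputFormat.configureIncrementalLoad(job, table);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The HFiles written to the output path can then be handed to LoadIncrementalHFiles (the completebulkload tool) to move them into the table.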
