hbase-user mailing list archives

From John <johnnyenglish...@gmail.com>
Subject Bulkload Problem
Date Sun, 20 Oct 2013 10:56:31 GMT

I'm trying to load a large amount of data into an HBase cluster. I've
successfully imported up to 3,000 million datasets (KV pairs), but when I try
to import 6,000 million I get this error after 60-95% of the import:
http://pastebin.com/CCp6kS3m ...

The system is not crashing or anything like that; all nodes are still up.
It seems to me that one node is temporarily unavailable. Is it perhaps
possible to increase the retry count? (I think the default is 10.) Which
setting do I have to change for that?
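For reference, in HBase 0.94 the client-side retry count is controlled by hbase.client.retries.number (default 10), with hbase.client.pause setting the base backoff between attempts. A minimal sketch of raising these in hbase-site.xml on the machine running the bulkload; the values shown are illustrative assumptions, not recommendations tested against this cluster:

```xml
<!-- hbase-site.xml on the client performing the bulkload.
     Values below are illustrative assumptions. -->
<property>
  <!-- number of retries before the client gives up (default 10 in 0.94) -->
  <name>hbase.client.retries.number</name>
  <value>30</value>
</property>
<property>
  <!-- base pause in milliseconds between retries; HBase backs off from this -->
  <name>hbase.client.pause</name>
  <value>1000</value>
</property>
```

Raising the retry count mainly buys time for a region server that is briefly unreachable (e.g. during heavy compaction or a region move) to come back before the client aborts.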

I'm using Cloudera 4.4.0-1 and HBase version 0.94.6-cdh4.4.0.


