hbase-user mailing list archives

From "Geoff Hendrey" <ghend...@decarta.com>
Subject RE: bulk loader question
Date Sun, 24 Apr 2011 06:43:21 GMT
Sorry guys - I found the answer to my own question. I was bulk loading a
sequence file full of Puts, and the version in each Put was accidentally
set to -1. Apparently this caused the bulk load not to work. I confirmed
it works after I fixed the version number in the Puts.
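For illustration, here is a minimal sketch of building a Put with an
explicit, valid timestamp instead of -1. It assumes the standard 0.90
client API; the row, family, and qualifier names are made up:

import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PutWithTimestamp {
    public static void main(String[] args) {
        byte[] row = Bytes.toBytes("row-1");      // illustrative row key
        byte[] family = Bytes.toBytes("cf");      // illustrative column family
        byte[] qualifier = Bytes.toBytes("col");
        byte[] value = Bytes.toBytes("value");

        // Use a real timestamp; a version of -1 apparently left the
        // bulk-loaded cells unreadable, as described above.
        long ts = System.currentTimeMillis();

        Put put = new Put(row);
        put.add(family, qualifier, ts, value);
    }
}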

-----Original Message-----
From: Geoff Hendrey [mailto:ghendrey@decarta.com] 
Sent: Saturday, April 23, 2011 7:39 PM
To: hbase-user@hadoop.apache.org
Subject: bulk loader question

I am running the bulk loader of HBase 0.90.1 per: "$ hadoop jar
hbase-VERSION.jar completebulkload /user/todd/myoutput mytable"
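For context, the output directory handed to completebulkload is typically
produced by a MapReduce job that converts the Puts into HFiles with
HFileOutputFormat. A rough sketch under the 0.90 mapreduce API follows; it
assumes the input sequence file holds ImmutableBytesWritable keys and Put
values, and the class and argument names are illustrative:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PrepareHFiles {

    // Passes each Put through keyed by its row so the reducer chosen by
    // configureIncrementalLoad can sort the cells and write HFiles.
    static class PassThroughMapper
            extends Mapper<ImmutableBytesWritable, Put, ImmutableBytesWritable, Put> {
        @Override
        protected void map(ImmutableBytesWritable key, Put put, Context context)
                throws IOException, InterruptedException {
            context.write(new ImmutableBytesWritable(put.getRow()), put);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "prepare-hfiles");
        job.setJarByClass(PrepareHFiles.class);

        job.setInputFormatClass(SequenceFileInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // sequence file of Puts
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HFile output directory

        job.setMapperClass(PassThroughMapper.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(Put.class);

        // Sets the partitioner, reducer, and output format so one set of
        // HFiles is produced per existing region of the target table.
        HFileOutputFormat.configureIncrementalLoad(job, new HTable(conf, args[2]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}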

 

I see the expected "Trying to load HFile ..." message for each of the
bulk-load HFiles. However, the table is inaccessible after the bulk load
process runs. For instance, in the HBase shell, a scan returns "0 rows"
after an inordinate pause (sometimes nearly a minute). Executing a count
from the HBase shell exhibits similar behavior (a long pause followed by
"0 row(s)").

 

Is there any way I can get more information from the bulk loader, or do
some kind of manual inspection of HBase's files so I can determine why
my data appears to load without error, but then cannot be accessed?

 

Thanks,

geoff

