hbase-user mailing list archives

From Jean-Daniel Cryans <jdcry...@apache.org>
Subject Re: LoadIncrementalHFile doesn't check hfile families?
Date Mon, 01 Aug 2011 22:06:32 GMT
Hi David,

I agree, that should be checked, even if just for the sake of
completeness. Can you please open a JIRA?

Thanks,

J-D

On Wed, Jul 27, 2011 at 6:02 PM, David Capwell <dcapwell@yahoo-inc.com> wrote:
> Heya, I am testing HBase with bulk loads and I'm seeing something unexpected.
>
> I'm generating a set of random KeyValues where the key, family, column, and value are all random strings; I then sort them with Arrays.sort(values, KeyValue.COMPARATOR);
> I wrote this list to a StoreFile.Writer under /tmp/$tableName/$family and tried to load it into HBase.
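
(For reference, here is a minimal, self-contained sketch of that generate-sort-write step. It uses HFile.Writer rather than StoreFile.Writer to keep it short; the table name "testtable", the family directory "f1", and the (fs, path, blocksize, compression, comparator) Writer constructor are assumptions about the 0.90-era API, not David's actual code.)

    import java.util.Arrays;
    import java.util.UUID;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.KeyValue;
    import org.apache.hadoop.hbase.io.hfile.Compression;
    import org.apache.hadoop.hbase.io.hfile.HFile;
    import org.apache.hadoop.hbase.util.Bytes;

    public class RandomHFileWriter {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        FileSystem fs = FileSystem.get(conf);

        // Output path follows the layout from the mail: /tmp/$tableName/$family/<hfile>
        Path hfilePath = new Path("/tmp/testtable/f1/hfile-0");

        // Random row/family/qualifier/value strings; note the family inside each
        // KeyValue is random, so it will generally not match the "f1" directory above.
        KeyValue[] values = new KeyValue[1000];
        for (int i = 0; i < values.length; i++) {
          values[i] = new KeyValue(
              Bytes.toBytes(UUID.randomUUID().toString()),   // row
              Bytes.toBytes(UUID.randomUUID().toString()),   // family
              Bytes.toBytes(UUID.randomUUID().toString()),   // qualifier
              Bytes.toBytes(UUID.randomUUID().toString()));  // value
        }
        // HFiles must be written in sorted KeyValue order.
        Arrays.sort(values, KeyValue.COMPARATOR);

        HFile.Writer writer = new HFile.Writer(fs, hfilePath,
            HFile.DEFAULT_BLOCKSIZE, Compression.Algorithm.NONE, KeyValue.KEY_COMPARATOR);
        try {
          for (KeyValue kv : values) {
            writer.append(kv);
          }
        } finally {
          writer.close();
        }
      }
    }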
>
> LoadIncrementalHFiles loadIncrementalHFiles = new LoadIncrementalHFiles(this.conf);
> loadIncrementalHFiles.doBulkLoad(path, table); // path is /tmp/$tableName
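
(A self-contained version of that call, spelling out the pieces the snippet above takes from fields; the table name "testtable" is again just an assumed placeholder.)

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoadRunner {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // Bulk-load root: one subdirectory per column family, each holding HFiles,
        // e.g. /tmp/testtable/f1/hfile-0 as written above.
        Path bulkLoadRoot = new Path("/tmp/testtable");

        HTable table = new HTable(conf, "testtable");
        try {
          new LoadIncrementalHFiles(conf).doBulkLoad(bulkLoadRoot, table);
        } finally {
          table.close();
        }
      }
    }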
>
> The store file was not rejected, and when I scan the table later with just the family, it seems to return all the randomly generated values (even though they don't belong to this family):
> Scan scan = new Scan();
> scan.addFamily(this.family);
> ResultScanner scanner = table.getScanner(scan);
> for(Result r : scanner) {
>   for(KeyValue kv: r.list()) {
>       KeyValueUtil.print(kv);
>   }
> }
>
> Based on http://hbase.apache.org/bulk-loads.html, LoadIncrementalHFiles will determine which region an HFile belongs to; should it also check that the HFile belongs to the right family?
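
(To make the suggestion concrete, here is a rough sketch of the kind of pre-flight check being asked for: open the HFile, pull the family out of its first key, and compare it to the family directory the file sits in. The HFile.Reader constructor, getFirstKey(), and KeyValue.createKeyValueFromKey() usages are my assumptions about the 0.90 API; a real fix would presumably live inside LoadIncrementalHFiles and check the last key as well.)

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.KeyValue;
    import org.apache.hadoop.hbase.io.hfile.HFile;
    import org.apache.hadoop.hbase.util.Bytes;

    public final class HFileFamilyCheck {

      // Check that the family stored in the HFile's first key matches the family
      // directory the file sits in (the name LoadIncrementalHFiles will load it
      // under), and fail fast if it doesn't.
      public static void checkFamily(Configuration conf, Path hfilePath) throws IOException {
        FileSystem fs = FileSystem.get(conf);
        byte[] expectedFamily = Bytes.toBytes(hfilePath.getParent().getName());

        HFile.Reader reader = new HFile.Reader(fs, hfilePath, null, false);
        try {
          reader.loadFileInfo();
          // getFirstKey() returns the raw key of the first KeyValue in the file.
          KeyValue first = KeyValue.createKeyValueFromKey(reader.getFirstKey());
          if (!Bytes.equals(expectedFamily, first.getFamily())) {
            throw new IOException("HFile " + hfilePath + " has family '"
                + Bytes.toString(first.getFamily()) + "' but sits under family directory '"
                + Bytes.toString(expectedFamily) + "'");
          }
        } finally {
          reader.close();
        }
      }
    }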
>
> I am running hbase-0.90.3 on 10 nodes, with hadoop-0.20.204.1 on the same 10 nodes.
>
> Thanks for taking the time to read this email.
