Subject: HBase MR with Filter
From: Patrick Datko
Reply-To: patrick.datko@ymc.ch
To: hbase-user@hadoop.apache.org
Organization: YMC AG
Date: Tue, 18 May 2010 18:25:46 +0200
Message-Id: <1274199946.10887.12.camel@station4.hq>

Hey,

I'm building a MapReduce job that reads data from an HBase table, filters it, and stores the reduced data in another HBase table.
I used a SingleColumnValueFilter to limit the data passed to the map phase. The problem is that the filter doesn't reduce anything: every row in the table still reaches the mapper. The filter is set up like this:

    Scan scan = new Scan();
    String columns = "details";
    String qualifier = "details:page";
    String value = "5";
    scan.setFilter(new SingleColumnValueFilter(Bytes.toBytes(columns),
        Bytes.toBytes(qualifier), CompareOp.EQUAL, Bytes.toBytes("5")));
    TableMapReduceUtil.initTableMapperJob("books", scan, mapper.class,
        ImmutableBytesWritable.class, IntWritable.class, job);

And this is how I fill the table:

    Put put = new Put(rowkey);
    put.add(Bytes.toBytes("details"), Bytes.toBytes("page"),
        Bytes.toBytes(rand.nextInt(20)));

I don't understand why the filter doesn't work. I hope somebody can help me.

Best regards,
Patrick
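One thing I noticed while debugging: the Put stores rand.nextInt(20) as an int, while the filter compares against the bytes of the string "5". If Bytes.toBytes(int) and Bytes.toBytes(String) produce different byte arrays, the comparison could never match. A quick plain-JDK check of the two encodings (the class name here is just for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class ByteEncodingCheck {
    public static void main(String[] args) {
        // Equivalent of Bytes.toBytes(5): a 4-byte big-endian integer.
        byte[] asInt = ByteBuffer.allocate(4).putInt(5).array();     // {0, 0, 0, 5}

        // Equivalent of Bytes.toBytes("5"): the UTF-8 text, one byte.
        byte[] asString = "5".getBytes(StandardCharsets.UTF_8);      // {0x35}

        // The two encodings of "5" are not byte-equal.
        System.out.println(Arrays.equals(asInt, asString));          // prints "false"
    }
}
```

So a value written as an int would never compare EQUAL to Bytes.toBytes("5"). Could that be related to my problem?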