From: Royston Sellman <royston.sellman@googlemail.com>
To: user@hbase.apache.org
Subject: RE: AggregateProtocol Help
Date: Sun, 1 Jan 2012 17:35:47 -0000

Hi Gary and Ted,

Royston (Tom's colleague) here. Back onto this after the Christmas/New Year break. Many thanks for your help so far.

We enabled our database via your hbase-site.xml mod and were able to move on to other errors. But I think we are now actually getting an aggregation partially calculated on our table (this feels like progress). The details:

On running our client we now get this exception:

11/12/31 17:51:09 WARN client.HConnectionManager$HConnectionImplementation: Error executing for row
java.util.concurrent.ExecutionException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=10, exceptions:
Sat Dec 31 17:41:30 GMT 2011, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1fc4f0f8, java.net.SocketTimeoutException: Call to namenode/10.0.0.235:60020 failed on socket timeout exception: java.net.SocketTimeoutException: 60000 millis timeout while waiting for channel to be ready for read.
ch : java.nio.channels.SocketChannel[connected local=/10.0.0.235:59999 remote=namenode/10.0.0.235:60020]

(8 more of these, making for 10 tries)

Sat Dec 31 17:51:09 GMT 2011, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1fc4f0f8, java.net.SocketTimeoutException: Call to namenode/10.0.0.235:60020 failed on socket timeout exception: java.net.SocketTimeoutException: 60000 millis timeout while waiting for channel to be ready for read.
ch : java.nio.channels.SocketChannel[connected local=/10.0.0.235:59364 remote=namenode/10.0.0.235:60020]
        at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
        at java.util.concurrent.FutureTask.get(FutureTask.java:83)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processExecs(HConnectionManager.java:1465)
        at org.apache.hadoop.hbase.client.HTable.coprocessorExec(HTable.java:1555)
        at org.apache.hadoop.hbase.client.coprocessor.AggregationClient.sum(AggregationClient.java:229)
        at EDRPAggregator.testSumWithValidRange(EDRPAggregator.java:51)
        at EDRPAggregator.main(EDRPAggregator.java:77)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

Looking at the log (.regionserver-namenode.log) I see this debug message:

2011-12-31 17:42:23,472 DEBUG org.apache.hadoop.hbase.coprocessor.AggregateImplementation: Sum from this region is EDRPTestTbl,,1324485124322.7b9ee0d113db9b24ea9fdde90702d006.: 0

The sum value looks reasonable, which makes me think the sum of a CF:CQ worked. But I never see this value on stdout.

Then I see this warning:

2011-12-31 17:42:23,476 WARN org.apache.hadoop.ipc.HBaseServer: (responseTooSlow): {"processingtimems":113146,"call":"execCoprocess$
2011-12-31 17:42:23,511 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server Responder, call execCoprocessor([B@4b22fad6, getSum(org.$
2011-12-31 17:42:23,515 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server handler 1 on 60020 caught: java.nio.channels.ClosedChann$
        at sun.nio.ch.SocketChannelImpl.ensureWriteOpen(SocketChannelImpl.java:133)
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:324)
        at org.apache.hadoop.hbase.ipc.HBaseServer.channelWrite(HBaseServer.java:1651)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.processResponse(HBaseServer.java:924)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.doRespond(HBaseServer.java:1003)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Call.sendResponseIfReady(HBaseServer.java:409)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1345)

Have we missed out some step in the HBase/RegionServer config? Or is our client code still deficient? Can you offer any suggestions? Is there any example code for the new Aggregations stuff?

Thanks and Happy New Year to you guys,
Royston (and Tom).

(HBase 0.92, Hadoop 1.0)
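A note on the numbers above: the 60000 ms figure matches HBase's default client RPC timeout, and the responseTooSlow line shows the getSum call took roughly 113 s on the region server, so it looks like the client gives up and retries while the server is still scanning. A minimal sketch of raising the timeout on the client, assuming the standard "hbase.rpc.timeout" property is honoured by a 0.92 build; the 600000 ms value is purely illustrative, not something from this thread:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;

    // Hedged sketch: give a long-running coprocessor aggregation more time
    // than the default 60 s before the client times out and retries.
    Configuration conf = HBaseConfiguration.create();
    conf.setLong("hbase.rpc.timeout", 600000L);   // 10 minutes, illustrative value
    AggregationClient aClient = new AggregationClient(conf);

The same property can also go in hbase-site.xml; the right value depends on how long the scan actually takes on your table.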
-----Original Message-----
From: Gary Helmling [mailto:ghelmling@gmail.com]
Sent: 23 December 2011 18:06
To: user@hbase.apache.org
Subject: Re: AggregateProtocol Help

Hi Tom,

The test code is not really the best guide for configuration.

To enable the AggregateProtocol on all of your tables, add this to the hbase-site.xml for the servers in your cluster:

  <property>
    <name>hbase.coprocessor.user.region.classes</name>
    <value>org.apache.hadoop.hbase.coprocessor.AggregateImplementation</value>
  </property>

If you only want to use the aggregate functions on a specific table (or tables), then you can enable that individually for the table from the shell:

1) disable the table
   hbase> disable 'EDRP7'

2) add the coprocessor
   hbase> alter 'EDRP7', METHOD => 'table_att', 'coprocessor'=>'|org.apache.hadoop.hbase.coprocessor.AggregateImplementation||'

   (Note that the pipes in the value string are required.)

3) re-enable the table
   hbase> enable 'EDRP7'

Either way should work. With the second approach you will see the coprocessor listed when you describe the table from the shell, as Ted mentioned. With the first approach you will not, but it should be loaded all the same.

--gh

On Fri, Dec 23, 2011 at 7:04 AM, Ted Yu wrote:
> I don't know why you chose HBaseTestingUtility to create the table.
> I guess you followed the test code example.
>
> At least you should pass the conf to this ctor:
>   public HBaseTestingUtility(Configuration conf)
>
> If the coprocessor was installed correctly, you should see something like (from HBASE-5070):
>   coprocessor$1 => '|org.apache.hadoop.hbase.constraint.ConstraintProcessor|1073741823|'
>
> Cheers
>
> On Fri, Dec 23, 2011 at 3:02 AM, Tom Wilcox wrote:
>> Hi,
>>
>> I am not sure how we load the AggregateImplementation into the table.
>> When we are creating a table, we use the same functions as the test, as follows...
>>
>> ...
>>     conf.set(CoprocessorHost.REGION_COPROCESSOR_CONF_KEY,
>>             "org.apache.hadoop.hbase.coprocessor.AggregateImplementation");
>>
>>     // Utility.CreateHBaseTable(conf, otherArgs[1], otherArgs[2], true);
>>
>>     HBaseTestingUtility util = new HBaseTestingUtility();
>>     HTable table = util.createTable(EDRP_TABLE, EDRP_FAMILY);
>>
>>     AggregationClient aClient = new AggregationClient(conf);
>> ...
>>
>> Running DESCRIBE on a table we produced shows the following output:
>>
>> hbase(main):002:0> describe 'EDRP7'
>> DESCRIPTION                                                            ENABLED
>>  {NAME => 'EDRP7', FAMILIES => [{NAME => 'advanceKWh', BLOOMFILTER =>  true
>>  'NONE', REPLICATION_SCOPE => '0', VERSIONS => '3', COMPRESSION =>
>>  'NONE', MIN_VERSIONS => '0', TTL => '2147483647', BLOCKSIZE =>
>>  '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}
>>
>> We are using the tip of 0.92 (cloned from the Git repo). See the version string below:
>>
>> hbase(main):005:0> version
>> 0.92.0, r1208286, Thu Dec 15 13:16:03 GMT 2011
>>
>> We would really appreciate an example of how to create a table that is enabled to handle aggregation.
>>
>> Thanks
>>
>>
>> ________________________________________
>> From: Ted Yu [yuzhihong@gmail.com]
>> Sent: 22 December 2011 17:03
>> To: user@hbase.apache.org
>> Subject: Re: AggregateProtocol Help
>>
>> Have you loaded AggregateImplementation into your table?
>> Can you show us the output of the following command in the hbase shell:
>>   describe 'your-table'
>>
>> BTW are you using the tip of 0.92?
>> HBASE-4946 would be of help for dynamically loaded coprocessors, which you might use in the future.
>>
>> Cheers
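Picking up Gary's and Ted's points, here is a minimal sketch of the table-creation example Tom asks for above: it creates the table directly on the running cluster (rather than through HBaseTestingUtility, which starts its own mini cluster) and attaches the aggregation coprocessor to the table descriptor. It assumes the HBaseAdmin / HTableDescriptor.addCoprocessor API available in 0.92; the class, table and family names are illustrative only.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.client.HBaseAdmin;

    public class CreateAggTable {
        public static void main(String[] args) throws Exception {
            // Picks up the cluster's hbase-site.xml (and ZooKeeper quorum)
            // from the classpath, so the table lands on the real cluster.
            Configuration conf = HBaseConfiguration.create();
            HBaseAdmin admin = new HBaseAdmin(conf);

            HTableDescriptor htd = new HTableDescriptor("EDRP7");   // illustrative name
            htd.addFamily(new HColumnDescriptor("advanceKWh"));     // illustrative family
            // Table-level equivalent of the shell 'coprocessor' attribute shown above.
            htd.addCoprocessor("org.apache.hadoop.hbase.coprocessor.AggregateImplementation");

            if (!admin.tableExists("EDRP7")) {
                admin.createTable(htd);
            }
        }
    }

With the table created this way, describe in the shell should list the coprocessor attribute, as in Ted's HBASE-5070 example.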
>> On Thu, Dec 22, 2011 at 8:09 AM, Tom Wilcox wrote:
>> > Hi,
>> >
>> > We are trying to use the aggregation functionality in HBase 0.92 and we have managed to get the test code working using the following command:
>> >
>> > java -classpath junit-4.10.jar:build/*:$HBASELIBS/* org.junit.runner.JUnitCore org.apache.hadoop.hbase.coprocessor.TestAggregateProtocol
>> >
>> > Closer inspection of this test class has revealed that it uses a mini DFS cluster to populate and run the tests. These tests return successfully.
>> >
>> > However, when we attempt to run similar code on our development HDFS cluster we experience the following error:
>> >
>> > 11/12/22 15:46:28 WARN client.HConnectionManager$HConnectionImplementation: Error executing for row
>> > java.util.concurrent.ExecutionException: org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: No matching handler for protocol org.apache.hadoop.hbase.coprocessor.AggregateProtocol in region EDRPTestTbl,,1324485124322.7b9ee0d113db9b24ea9fdde90702d006.
>> >         at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:4010)
>> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3040)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >         at java.lang.reflect.Method.invoke(Method.java:597)
>> >         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
>> >         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1325)
>> > [sshexec]
>> >         at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
>> >         at java.util.concurrent.FutureTask.get(FutureTask.java:83)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processExecs(HConnectionManager.java:1465)
>> >         at org.apache.hadoop.hbase.client.HTable.coprocessorExec(HTable.java:1555)
>> >         at org.apache.hadoop.hbase.client.coprocessor.AggregationClient.sum(AggregationClient.java:229)
>> >         at EDRPAggregator.testSumWithValidRange(EDRPAggregator.java:51)
>> >         at EDRPAggregator.main(EDRPAggregator.java:77)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >         at java.lang.reflect.Method.invoke(Method.java:597)
>> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> > Caused by: org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: No matching handler for protocol org.apache.hadoop.hbase.coprocessor.AggregateProtocol in region EDRPTestTbl,,1324485124322.7b9ee0d113db9b24ea9fdde90702d006.
>> >         at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:4010)
>> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3040)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >         at java.lang.reflect.Method.invoke(Method.java:597)
>> >         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
>> >         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1325)
>> > [sshexec]
>> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>> >         at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:96)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.translateException(HConnectionManager.java:1651)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1327)
>> >         at org.apache.hadoop.hbase.ipc.ExecRPCInvoker.invoke(ExecRPCInvoker.java:79)
>> >         at $Proxy3.getSum(Unknown Source)
>> >         at org.apache.hadoop.hbase.client.coprocessor.AggregationClient$4.call(AggregationClient.java:233)
>> >         at org.apache.hadoop.hbase.client.coprocessor.AggregationClient$4.call(AggregationClient.java:230)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$4.call(HConnectionManager.java:1453)
>> >         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>> >         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>> >         at java.lang.Thread.run(Thread.java:662)
>> > org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: org.apache.hadoop.hbase.ipc.HBaseRPC$UnknownProtocolException: No matching handler for protocol org.apache.hadoop.hbase.coprocessor.AggregateProtocol in region EDRPTestTbl,,1324485124322.7b9ee0d113db9b24ea9fdde90702d006.
>> >         at org.apache.hadoop.hbase.regionserver.HRegion.exec(HRegion.java:4010)
>> >         at org.apache.hadoop.hbase.regionserver.HRegionServer.execCoprocessor(HRegionServer.java:3040)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >         at java.lang.reflect.Method.invoke(Method.java:597)
>> >         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
>> >         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1325)
>> > [sshexec]
>> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>> >         at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:96)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.translateException(HConnectionManager.java:1651)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1327)
>> >         at org.apache.hadoop.hbase.ipc.ExecRPCInvoker.invoke(ExecRPCInvoker.java:79)
>> >         at $Proxy3.getSum(Unknown Source)
>> >         at org.apache.hadoop.hbase.client.coprocessor.AggregationClient$4.call(AggregationClient.java:233)
>> >         at org.apache.hadoop.hbase.client.coprocessor.AggregationClient$4.call(AggregationClient.java:230)
>> >         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$4.call(HConnectionManager.java:1453)
>> >         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>> >         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>> >         at java.lang.Thread.run(Thread.java:662)
>> >
>> > The source for our class is:
>> >
>> > import static org.junit.Assert.assertEquals;
>> >
>> > import java.io.IOException;
>> >
>> > import org.apache.hadoop.conf.Configuration;
>> > import org.apache.hadoop.hbase.HBaseConfiguration;
>> > import org.apache.hadoop.hbase.HBaseTestingUtility;
>> > import org.apache.hadoop.hbase.HConstants;
>> > import org.apache.hadoop.hbase.HTableDescriptor;
>> > import org.apache.hadoop.hbase.client.HTable;
>> > import org.apache.hadoop.hbase.client.Put;
>> > import org.apache.hadoop.hbase.client.Scan;
>> > import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
>> > import org.apache.hadoop.hbase.client.coprocessor.LongColumnInterpreter;
>> > import org.apache.hadoop.hbase.util.Bytes;
>> > import org.apache.hadoop.util.GenericOptionsParser;
>> > import org.apache.hadoop.hbase.coprocessor.ColumnInterpreter;
>> > import org.apache.hadoop.hbase.coprocessor.CoprocessorHost;
>> > import org.junit.Test;
>> >
>> > public class EDRPAggregator {
>> >
>> >     // private static final byte[] EDRP_FAMILY = Bytes.toBytes("EDRP");
>> >     // private static final byte[] EDRP_QUALIFIER = Bytes.toBytes("advanceKWh");
>> >
>> >     private static byte[] ROW = Bytes.toBytes("testRow");
>> >     private static final int ROWSIZE = 20;
>> >     private static byte[][] ROWS = makeN(ROW, ROWSIZE);
>> >     private static final byte[] TEST_QUALIFIER = Bytes.toBytes("TestQualifier");
>> >     private static final byte[] TEST_MULTI_CQ = Bytes.toBytes("TestMultiCQ");
>> >     private static final int rowSeperator1 = 5;
>> >     private static final int rowSeperator2 = 12;
>> >
>> >     public static void testSumWithValidRange(Configuration conf, String[] otherArgs) throws Throwable {
>> >         byte[] EDRP_TABLE = Bytes.toBytes(otherArgs[1]);
>> >         byte[] EDRP_FAMILY = Bytes.toBytes(otherArgs[2]);
>> >
>> >         conf.set(CoprocessorHost.REGION_COPROCESSOR_CONF_KEY,
>> >                 "org.apache.hadoop.hbase.coprocessor.AggregateImplementation");
>> >
>> >         // Utility.CreateHBaseTable(conf, otherArgs[1], otherArgs[2], true);
>> >
>> >         HBaseTestingUtility util = new HBaseTestingUtility();
>> >         HTable table = util.createTable(EDRP_TABLE, EDRP_FAMILY);
>> >
>> >         AggregationClient aClient = new AggregationClient(conf);
>> >         Scan scan = new Scan();
>> >         scan.addColumn(EDRP_TABLE, EDRP_FAMILY);
>> >         final ColumnInterpreter ci = new LongColumnInterpreter();
>> >         long sum = aClient.sum(Bytes.toBytes(otherArgs[0]), ci, scan);
>> >         System.out.println(sum);
>> >     }
>> >
>> >     /**
>> >      * Main entry point.
>> >      *
>> >      * @param args The command line parameters.
>> >      * @throws Exception When running the job fails.
>> >      */
>> >     public static void main(String[] args) throws Exception {
>> >         Configuration conf = HBaseConfiguration.create();
>> >
>> >         String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
>> >         if (otherArgs.length != 3) {
>> >             System.err.println("Wrong number of arguments: " + otherArgs.length);
>> >             System.err.println("Usage: " + " ");
>> >             System.exit(-1);
>> >         }
>> >
>> >         try {
>> >             testSumWithValidRange(conf, otherArgs);
>> >         } catch (Throwable e) {
>> >             e.printStackTrace();
>> >         }
>> >     }
>> >
>> >     /**
>> >      * an infrastructure method to prepare rows for the testtable.
>> >      *
>> >      * @param base
>> >      * @param n
>> >      * @return
>> >      */
>> >     private static byte[][] makeN(byte[] base, int n) {
>> >         byte[][] ret = new byte[n][];
>> >         for (int i = 0; i < n; i++) {
>> >             ret[i] = Bytes.add(base, Bytes.toBytes(i));
>> >         }
>> >         return ret;
>> >     }
>> > }
>> >
>> > Please can you suggest what might be causing and/or how we might fix this UnknownProtocolException?
>> >
>> > Also, does anyone have any working examples using the aggregation protocol other than the test code?
>> >
>> > Thanks,
>> > Tom
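For reference, a minimal client-side sketch of the kind of working example asked for above, run against a table that already exists on the cluster and already has AggregateImplementation loaded (via hbase-site.xml, the shell alter, or the HBaseAdmin sketch earlier in the thread). The table, family and qualifier names are taken from the commented-out constants in the code above and are illustrative only; LongColumnInterpreter assumes each cell holds an 8-byte long written with Bytes.toBytes(long).

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
    import org.apache.hadoop.hbase.client.coprocessor.LongColumnInterpreter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class EDRPSumExample {
        public static void main(String[] args) throws Throwable {
            // Connects to the running cluster via hbase-site.xml on the classpath.
            Configuration conf = HBaseConfiguration.create();
            AggregationClient aClient = new AggregationClient(conf);

            Scan scan = new Scan();
            // addColumn takes (family, qualifier); note the code above passes
            // (table, family) here, which is worth double-checking.
            scan.addColumn(Bytes.toBytes("EDRP"), Bytes.toBytes("advanceKWh"));

            long sum = aClient.sum(Bytes.toBytes("EDRPTestTbl"),
                    new LongColumnInterpreter(), scan);
            System.out.println("sum = " + sum);
        }
    }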