hbase-user mailing list archives

From ramkrishna vasudevan <ramkrishna.s.vasude...@gmail.com>
Subject Re: Concurrently Reading Still Got Exceptions
Date Sat, 02 Mar 2013 17:48:36 GMT
This could be a case where the RPC connection's response stream was closed and the client then acted on a reference that had already been nulled out.  I remember similar issues being fixed in this area; I don't remember whether the fix made it into the 0.92 line.

On Sat, Mar 2, 2013 at 11:14 PM, Anoop John <anoop.hbase@gmail.com> wrote:

> Is this really related to concurrent reads?  I suspect something else.
> I will dig into the code tomorrow.  Can you attach a JUnit test case that
> reproduces the NPE?
>
> -Anoop-
>
>
> On Sat, Mar 2, 2013 at 9:29 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>
> > Looks like the issue might be related to HTable:
> >
> >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:1167)
> >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.next(HTable.java:1296)
> >           at org.apache.hadoop.hbase.client.HTable$ClientScanner$1.hasNext(HTable.java:1356)
> >
> > In newer versions of HBase (0.94), you can pass an ExecutorService to the
> > HTable constructor so that you don't need to use HTablePool:
> >
> >   public HTable(Configuration conf, final byte[] tableName,
> >       final ExecutorService pool)
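A minimal, JDK-only sketch of the pattern Ted describes: instead of pooling table objects, many cheap per-thread handles share one ExecutorService. `TableHandle` below is a hypothetical stand-in for HTable (the real HBase classes are not assumed here); only the sharing pattern is the point.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SharedPoolDemo {
    // Stand-in for HTable: lightweight, created per thread,
    // but backed by ONE shared ExecutorService (the 0.94 ctor arg).
    static class TableHandle {
        private final ExecutorService pool;
        TableHandle(ExecutorService pool) { this.pool = pool; }
        Future<Integer> fetch(final int key) {
            return pool.submit(new Callable<Integer>() {
                public Integer call() { return key * 2; } // fake "read"
            });
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutorService shared = Executors.newFixedThreadPool(4);
        int sum = 0;
        // Many short-lived handles, one shared pool; no handle is
        // ever used from two threads at once.
        for (int i = 0; i < 10; i++) {
            TableHandle h = new TableHandle(shared);
            sum += h.fetch(i).get();
        }
        shared.shutdown();
        System.out.println(sum); // 2 * (0 + 1 + ... + 9) = 90
    }
}
```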
> >
> > Cheers
> >
> > On Wed, Feb 6, 2013 at 2:27 AM, Bing Li <lblabs@gmail.com> wrote:
> >
> > > Dear all,
> > >
> > > Some exceptions are raised when I concurrently read data from HBase.
> > > The version of HBase I used is 0.92.0.
> > >
> > > I cannot fix the problem. Could you please help me?
> > >
> > > Thanks so much!
> > >
> > > Best wishes,
> > > Bing
> > >
> > >       Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.ipc.HBaseClient$Connection run
> > >       WARNING: Unexpected exception receiving call responses
> > >       java.lang.NullPointerException
> > >           at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
> > >           at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
> > >       Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.client.ScannerCallable close
> > >       WARNING: Ignore, probably already closed
> > >       java.io.IOException: Call to greatfreeweb/127.0.1.1:60020 failed on local exception: java.io.IOException: Unexpected exception receiving call responses
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:934)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:903)
> > >           at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
> > >           at $Proxy6.close(Unknown Source)
> > >           at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:112)
> > >           at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:74)
> > >           at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:39)
> > >           at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1325)
> > >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:1167)
> > >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.next(HTable.java:1296)
> > >           at org.apache.hadoop.hbase.client.HTable$ClientScanner$1.hasNext(HTable.java:1356)
> > >           at com.greatfree.hbase.rank.NodeRankRetriever.loadNodeGroupNodeRankRowKeys(NodeRankRetriever.java:348)
> > >           at com.greatfree.ranking.PersistNodeGroupNodeRanksThread.run(PersistNodeGroupNodeRanksThread.java:29)
> > >           at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > >           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > >           at java.lang.Thread.run(Thread.java:662)
> > >       Caused by: java.io.IOException: Unexpected exception receiving call responses
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:509)
> > >       Caused by: java.lang.NullPointerException
> > >           at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
> > >           at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
> > >
> > > I read data from HBase concurrently with the following code.
> > >
> > >         ...
> > >         ExecutorService threadPool = Executors.newFixedThreadPool(100);
> > >         LoadNodeGroupNodeRankRowKeyThread thread;
> > >         Set<String> groupKeys;
> > >         for (String nodeKey : nodeKeys)
> > >         {
> > >                 groupKeys = NodeCache.WWW().getGroupKeys(nodeKey);
> > >                 for (String groupKey : groupKeys)
> > >                 {
> > >                         // Threads are initialized and executed here.
> > >                         thread = new LoadNodeGroupNodeRankRowKeyThread(nodeKey, groupKey, TimingScale.PERMANENTLY);
> > >                         threadPool.execute(thread);
> > >                 }
> > >         }
> > >         Scanner in = new Scanner(System.in);
> > >         in.nextLine();
> > >         threadPool.shutdownNow();
> > >         ...
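One thing worth noting about the driver above: shutdownNow() interrupts the worker threads, and an interrupt delivered while an RPC response is being read can close the connection under the reader, which matches the "Ignore, probably already closed" warnings. A JDK-only sketch of a gentler shutdown (no HBase types involved, the 30-second timeout is an arbitrary example value):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class GracefulShutdown {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService threadPool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 8; i++) {
            final int id = i;
            threadPool.execute(new Runnable() {
                public void run() { System.out.println("task " + id + " done"); }
            });
        }
        threadPool.shutdown();                         // stop accepting new tasks
        if (!threadPool.awaitTermination(30, TimeUnit.SECONDS)) {
            threadPool.shutdownNow();                  // interrupt only as a last resort
        }
        System.out.println("pool terminated: " + threadPool.isTerminated());
    }
}
```

shutdown() plus awaitTermination() lets in-flight scans finish cleanly, so no thread is interrupted mid-response.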
> > >
> > > The code of LoadNodeGroupNodeRankRowKeyThread is as follows,
> > >
> > >         ...
> > >         public void run()
> > >         {
> > >                 NodeRankRetriever retriever = new NodeRankRetriever();
> > >                 Set<String> rowKeys = retriever.loadNodeGroupNodeRankRowKeys(this.hostNodeKey, this.groupKey, this.timingScale);
> > >                 if (rowKeys.size() > 0)
> > >                 {
> > >                         for (String rowKey : rowKeys)
> > >                         {
> > >                                 System.out.println(rowKey);
> > >                         }
> > >                 }
> > >                 else
> > >                 {
> > >                         System.out.println("No data loaded");
> > >                 }
> > >                 retriever.dispose();
> > >         }
> > >         ...
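A side observation on the run() method above: if loadNodeGroupNodeRankRowKeys() throws, dispose() is never reached and the pooled table is leaked. A try/finally guarantees the cleanup; `Retriever` below is a hypothetical stand-in for the poster's NodeRankRetriever, written so the sketch is self-contained:

```java
import java.util.Collections;
import java.util.Set;

public class TryFinallyDemo {
    // Stand-in for NodeRankRetriever: load() may throw, dispose()
    // must run regardless.
    static class Retriever {
        Set<String> load(boolean fail) {
            if (fail) throw new RuntimeException("scan failed");
            return Collections.singleton("row-1");
        }
        void dispose() { System.out.println("disposed"); }
    }

    public static void main(String[] args) {
        Retriever retriever = new Retriever();
        try {
            for (String rowKey : retriever.load(true)) {
                System.out.println(rowKey);
            }
        } catch (RuntimeException e) {
            System.out.println("caught: " + e.getMessage());
        } finally {
            retriever.dispose(); // runs even when the scan throws
        }
    }
}
```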
> > >
> > > The constructor of NodeRankRetriever() just gets an instance of HTable
> > > from the HTablePool via the following method.
> > >
> > >         ...
> > >         public HTableInterface getTable(String tableName)
> > >         {
> > >                 return this.hTablePool.getTable(tableName);
> > >         }
> > >         ...
> > >
> > > The dispose() method of NodeRankRetriever just closes the
> > > HTableInterface obtained from the HTablePool.
> > >
> > >         ...
> > >         public void dispose()
> > >         {
> > >                 try
> > >                 {
> > >                         this.rankTable.close();
> > >                 }
> > >                 catch (IOException e)
> > >                 {
> > >                         e.printStackTrace();
> > >                 }
> > >         }
> > >         ...
> > >
> >
>
