hbase-user mailing list archives

From ramkrishna vasudevan <ramkrishna.s.vasude...@gmail.com>
Subject Re: Error about rs block seek
Date Mon, 13 May 2013 08:34:23 GMT
Is it possible to reproduce this with a simple test case based on your
use case and data? If you can share it, we can really debug the actual
problem.
Regards
Ram


On Mon, May 13, 2013 at 1:57 PM, Anoop John <anoop.hbase@gmail.com> wrote:

> > So I want to know: if I set the block size when creating tables, can
> that cause trouble?
>
> It should not. We have tested with different block sizes, from the default
> 64K down to 8K, for testing purposes, and have not come across issues like
> this. Does it happen only with this data, or every time you create a new
> table with a 32K block size, do some writes, and then read?
>
> -Anoop-
>
> On Mon, May 13, 2013 at 1:36 PM, Bing Jiang <jiangbinglover@gmail.com> wrote:
>
> > hi, Anoop.
> > I do not handle or change the hbase checksum.
> >
> > So I want to know: if I set the block size when creating tables, can
> > that cause trouble?
> >
> >
> > 2013/5/13 Anoop John <anoop.hbase@gmail.com>
> >
> > > > Current pos = 32651;
> > > currKeyLen = 45; currValLen = 80; block limit = 32775
> > >
> > > This means that after the current position we need at least 45 + 80 +
> > > 4 (key length stored as 4 bytes) + 4 (value length stored as 4 bytes)
> > > more bytes, so the limit should have been at least 32784. If the
> > > memstoreTS is also written with this KV, a few more bytes on top of that..
> > >
> > > Do you use HBase-handled checksums?
> > >
> > > -Anoop-
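Anoop's byte accounting above can be verified with a small sketch. This is illustrative arithmetic only, not the actual HFileReaderV2 code; the class and the `fitsInBlock` helper are invented for the example:

```java
// Illustrative only, not the real HFileReaderV2 code. A KeyValue in a data
// block is laid out as [4-byte key length][4-byte value length][key][value],
// so reading the next KV starting at `pos` needs pos + 8 + keyLen + valLen
// to stay within the block limit (a memstoreTS, if present, needs more).
public class BlockSeekCheck {
    static boolean fitsInBlock(int pos, int keyLen, int valLen, int limit) {
        long required = (long) pos + 4 + 4 + keyLen + valLen;
        System.out.println("required = " + required + ", limit = " + limit);
        return required <= limit;
    }

    public static void main(String[] args) {
        // Values taken from the error log in this thread:
        System.out.println(fitsInBlock(32651, 45, 80, 32775));
    }
}
```

With the logged values this prints `required = 32784, limit = 32775` and `false`, matching Anoop's conclusion that the limit is at least 9 bytes short.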
> > >
> > > On Mon, May 13, 2013 at 12:00 PM, Bing Jiang <jiangbinglover@gmail.com> wrote:
> > >
> > > > Hi,all
> > > > Before the exception stack, there is an Error log:
> > > > 2013-05-13 00:00:14,491 ERROR org.apache.hadoop.hbase.io.hfile.HFileReaderV2: Current pos = 32651; currKeyLen = 45; currValLen = 80; block limit = 32775; HFile name = 1f96183d55144c058fa2a05fe5c0b814; currBlock currBlockOffset = 33550830
> > > >
> > > > And the operation is the scanner's next().
> > > > Current pos + currKeyLen + currValLen > block limit:
> > > > 32651 + 45 + 80 = 32776 > 32775. In my table config the blocksize is
> > > > set to 32768, and since I changed the blocksize from 64k (the default
> > > > value) to 32k, many of these error logs have appeared.
> > > >
> > > > I use 0.94.3. Can someone tell me about the influence of the blocksize setting?
> > > >
> > > > Tks.
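The IllegalArgumentException at java.nio.Buffer.position(Buffer.java:216) in the stack trace further down the thread is exactly what a ByteBuffer throws when a seek tries to move past the block limit. A minimal stdlib-only reproduction of that mechanism (the position and limit come from the error log; the scenario is illustrative, not the actual HBase code path):

```java
import java.nio.ByteBuffer;

// Minimal reproduction of the java.nio.Buffer.position IllegalArgumentException
// seen in the stack trace in this thread. Position/limit values come from the
// error log; everything else is illustrative.
public class PositionDemo {
    public static void main(String[] args) {
        ByteBuffer block = ByteBuffer.allocate(40000);
        block.limit(32775);       // block limit from the log
        block.position(32651);    // current pos from the log: still legal
        try {
            // blockSeek would advance past the current KV:
            // 4 + 4 length bytes + 45-byte key + 80-byte value -> 32784 > limit
            block.position(32651 + 8 + 45 + 80);
        } catch (IllegalArgumentException e) {
            System.out.println("IllegalArgumentException: new position exceeds limit");
        }
    }
}
```

So the error message and the exception are consistent: the scanner believes a 45+80 byte KV starts at position 32651, but the block's limit is only 32775.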
> > > >
> > > >
> > > >
> > > >
> > > > 2013/5/13 ramkrishna vasudevan <ramkrishna.s.vasudevan@gmail.com>
> > > >
> > > > > Your TTL is negative here 'TTL => '-1','.
> > > > >
> > > > > Any reason for it to be negative? This could be a possible reason.
> > > > > Not sure..
> > > > >
> > > > > Regards
> > > > > Ram
> > > > >
> > > > >
> > > > > On Mon, May 13, 2013 at 7:20 AM, Bing Jiang <jiangbinglover@gmail.com> wrote:
> > > > >
> > > > > > hi, Ted.
> > > > > >
> > > > > > No data block encoding, our table config below:
> > > > > >
> > > > > > User Table Description
> > > > > > CrawlInfo<http://10.100.12.33:8003/table.jsp?name=CrawlInfo> {NAME => 'CrawlInfo', DEFERRED_LOG_FLUSH => 'true', MAX_FILESIZE => '34359738368', FAMILIES => [{NAME => 'CrawlStats', BLOOMFILTER => 'ROWCOL', CACHE_INDEX_ON_WRITE => 'true', TTL => '-1', CACHE_DATA_ON_WRITE => 'true', CACHE_BLOOMS_ON_WRITE => 'true', VERSIONS => '1', BLOCKSIZE => '32768'}]}
> > > > > >
> > > > > >
> > > > > >
> > > > > > 2013/5/13 Bing Jiang <jiangbinglover@gmail.com>
> > > > > >
> > > > > > > Hi, JM.
> > > > > > > Our jdk version is 1.6.0_38
> > > > > > >
> > > > > > >
> > > > > > > 2013/5/13 Jean-Marc Spaggiari <jean-marc@spaggiari.org>
> > > > > > >
> > > > > > >> Hi Bing,
> > > > > > >>
> > > > > > >> Which JDK are you using?
> > > > > > >>
> > > > > > >> Thanks,
> > > > > > >>
> > > > > > >> JM
> > > > > > >>
> > > > > > >> 2013/5/12 Bing Jiang <jiangbinglover@gmail.com>
> > > > > > >>
> > > > > > >> > Yes, we use hbase-0.94.3, and we changed block.size from 64k to 32k.
> > > > > > >> >
> > > > > > >> >
> > > > > > >> > 2013/5/13 Ted Yu <yuzhihong@gmail.com>
> > > > > > >> >
> > > > > > >> > > Can you tell us the version of hbase you are using?
> > > > > > >> > > Did this problem happen recently ?
> > > > > > >> > >
> > > > > > >> > > Thanks
> > > > > > >> > >
> > > > > > >> > > On May 12, 2013, at 6:25 PM, Bing Jiang <jiangbinglover@gmail.com> wrote:
> > > > > > >> > >
> > > > > > >> > > > Hi, all.
> > > > > > >> > > > In our hbase cluster, there are many logs like below:
> > > > > > >> > > >
> > > > > > >> > > > 2013-05-13 00:00:04,161 ERROR org.apache.hadoop.hbase.regionserver.HRegionServer:
> > > > > > >> > > > java.lang.IllegalArgumentException
> > > > > > >> > > >         at java.nio.Buffer.position(Buffer.java:216)
> > > > > > >> > > >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.blockSeek(HFileReaderV2.java:882)
> > > > > > >> > > >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$ScannerV2.loadBlockAndSeekToKey(HFileReaderV2.java:753)
> > > > > > >> > > >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:487)
> > > > > > >> > > >         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:501)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:226)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:145)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:131)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.Store.getScanner(Store.java:2073)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:3412)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:1642)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1634)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1610)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:4230)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:4204)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.get(HRegionServer.java:2025)
> > > > > > >> > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.multi(HRegionServer.java:3461)
> > > > > > >> > > >         at sun.reflect.GeneratedMethodAccessor30.invoke(Unknown Source)
> > > > > > >> > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > > > > >> > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > > > > >> > > >         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:364)
> > > > > > >> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1426)
> > > > > > >> > > >
> > > > > > >> > > >
> > > > > > >> > > >
> > > > > > >> > > > and Table config:
> > > > > > >> > > >
> > > > > > >> > > >
> > > > > > >> > > > Can anyone tell me how I can find the cause of this?
> > > > > > >> > > >
> > > > > > >> > > > --
> > > > > > >> > > > Bing Jiang
> > > > > > >> > > > weibo: http://weibo.com/jiangbinglover
> > > > > > >> > > > BLOG: http://blog.sina.com.cn/jiangbinglover
> > > > > > >> > > > National Research Center for Intelligent Computing Systems
> > > > > > >> > > > Institute of Computing technology
> > > > > > >> > > > Graduate University of Chinese Academy of Science
> > > > > > >> > >
> > > > > > >> >
> > > > > > >> >
> > > > > > >> >
> > > > > > >> >
> > > > > > >>
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > >
> > > >
> > > >
> > > >
> > > >
> > >
> >
> >
> >
> >
>
