hbase-user mailing list archives

From "Jean-Daniel Cryans" <jdcry...@apache.org>
Subject Re: HBase 0.19.0 RC:Could not obtain block error
Date Mon, 19 Jan 2009 12:54:51 GMT
Glad to know it works for you!

I guess, for 0.19, that we should tell people to change that stuff right
away just to be sure they don't run into that kind of problem.
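
For reference, a sketch of what that change looks like; the property goes in hadoop-site.xml on each datanode (the property name really is spelled "xcievers" in Hadoop), and the value below is only an illustrative guess, not a recommendation:

```xml
<!-- hadoop-site.xml: raise the cap on concurrent DataNode xceiver
     threads; the small default is easily exhausted by HBase.
     The value here is illustrative -- size it for your cluster. -->
<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>2047</value>
</property>
```

The datanodes need a restart for the change to take effect.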

J-D

On Mon, Jan 19, 2009 at 7:06 AM, Genady <genadyg@exelate.com> wrote:

> Hi,
>
> Thanks for your help; your guess about HDFS problems pointed me to where
> to look. After checking the logs I found that my configuration was missing
> the xcievers setting tuning
> (http://wiki.apache.org/hadoop/Hbase/Troubleshooting#5), and I was also
> hitting the HBASE-1132 problem, which was fixed in HBase 0.19 RC2.
>
> Thanks,
> Gennady
>
> -----Original Message-----
> From: jdcryans@gmail.com [mailto:jdcryans@gmail.com] On Behalf Of
> Jean-Daniel Cryans
> Sent: Sunday, January 18, 2009 10:34 PM
> To: hbase-user@hadoop.apache.org
> Subject: Re: HBase 0.19.0 RC:Could not obtain block error
>
> Genady,
>
> That type of error means that HDFS is having some difficulties serving
> HBase. Things you should do:
>
> - Look in your datanode logs for any signs of
> http://wiki.apache.org/hadoop/Hbase/FAQ#6. By the way, what you referred
> to as Hadoop logs are in fact HBase logs.
>
> - Tell us about your hardware.
>
> - Try to find out how many regions were in your table when things began to
> go wrong. Also check whether your nodes are swapping (or show any other
> sign of overloading).
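
For the first item in the list above, a quick grep over the datanode logs is usually enough to spot it. The log directory below is hypothetical; adjust it to your install:

```shell
# Hypothetical log directory -- adjust to your Hadoop install.
HADOOP_LOG_DIR=${HADOOP_LOG_DIR:-/var/log/hadoop}
# Count DataXceiver errors mentioning the xceiver limit. The Hadoop
# property name is misspelled "xcievers", so match both spellings.
grep -ricE "xceiver|xciever" "$HADOOP_LOG_DIR" 2>/dev/null || true
```

A flood of such errors is the telltale sign described in the FAQ entry.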
>
> Thx,
>
> J-D
>
> On Sun, Jan 18, 2009 at 3:15 PM, Genady Gillin <genadyg@exelate.com>
> wrote:
>
> > Hi,
> >
> >
> > I'm running Hadoop 0.19/HBase 0.19.0 RC on four nodes. After inserting
> > about 200G of data, MR tasks start to fail with NoServerForRegionException,
> > and the region server log is full of "Could not obtain block
> > /hbase/.META./1028785192/info/mapfiles/722281705778744225/data" exceptions.
> > Restarting Hadoop/HBase doesn't help. Is there any workaround to solve this?
> > BTW: the data is there and can be seen with dfs -cat.
> >
> > Hadoop settings:
> > dfs.replication=2
> > // as recommended in the troubleshooting section of the HBase site:
> > dfs.datanode.handler.count=5
> > dfs.datanode.socket.write.timeout=0
> >
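
In hadoop-site.xml terms, the settings listed above correspond to the following (values copied from the message; this is just the XML form, not a recommendation):

```xml
<!-- XML form of the hadoop-site.xml settings listed above -->
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
<property>
  <name>dfs.datanode.handler.count</name>
  <value>5</value>
</property>
<property>
  <name>dfs.datanode.socket.write.timeout</name>
  <value>0</value>  <!-- 0 disables the datanode write timeout -->
</property>
```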
> > Any help would be appreciated,
> > Gennady
> >
> > HBase logs:
> >
> > org.apache.hadoop.hbase.client.NoServerForRegionException: Timed out trying to locate root region
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRootRegion(HConnectionManager.java:768)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:448)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:430)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:557)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:457)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.relocateRegion(HConnectionManager.java:430)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegionInMeta(HConnectionManager.java:557)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:461)
> >        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.locateRegion(HConnectionManager.java:423)
> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:114)
> >        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:97)
> >        at org.apache.hadoop.hbase.mapred.TableOutputFormat.getRecordWriter(TableOutputFormat.java:90)
> >        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:399)
> >        at org.apache.hadoop.mapred.Child.main(Child.java:155)
> >
> >
> > Hadoop logs:
> >
> >
> > java.io.IOException: java.io.IOException: HStoreScanner failed construction
> >        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:70)
> >        at org.apache.hadoop.hbase.regionserver.HStoreScanner.<init>(HStoreScanner.java:88)
> >        at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2125)
> >        at org.apache.hadoop.hbase.regionserver.HRegion$HScanner.<init>(HRegion.java:1989)
> >        at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1180)
> >        at org.apache.hadoop.hbase.regionserver.HRegionServer.openScanner(HRegionServer.java:1700)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:632)
> >        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:894)
> > Caused by: java.io.IOException: Could not obtain block: blk_-5027942609246712759_11342 file=/hbase/.META./1028785192/info/mapfiles/722281705778744225/index
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1708)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1536)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1663)
> >        at java.io.DataInputStream.readFully(DataInputStream.java:178)
> >        at java.io.DataInputStream.readFully(DataInputStream.java:152)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.init(SequenceFile.java:1464)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1442)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1431)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1426)
> >        at org.apache.hadoop.hbase.io.MapFile$Reader.open(MapFile.java:301)
> >        at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.<init>(HBaseMapFile.java:79)
> >        at org.apache.hadoop.hbase.io.BloomFilterMapFile$Reader.<init>(BloomFilterMapFile.java:65)
> >        at org.apache.hadoop.hbase.regionserver.HStoreFile.getReader(HStoreFile.java:443)
> >        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.openReaders(StoreFileScanner.java:96)
> >        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:67)
> >        ... 11 more
> >
> >        at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
> >        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >        at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:95)
> >        at org.apache.hadoop.hbase.master.BaseScanner.scanRegion(BaseScanner.java:185)
> >        at org.apache.hadoop.hbase.master.MetaScanner.scanOneMetaRegion(MetaScanner.java:73)
> >        at org.apache.hadoop.hbase.master.MetaScanner.maintenanceScan(MetaScanner.java:129)
> >        at org.apache.hadoop.hbase.master.BaseScanner.chore(BaseScanner.java:137)
> >        at org.apache.hadoop.hbase.Chore.run(Chore.java:65)
> > 2009-01-18 14:19:06,038 INFO org.apache.hadoop.hbase.master.BaseScanner: All 1 .META. region(s) scanned
> > 2009-01-18 14:19:56,300 INFO org.apache.hadoop.hbase.master.BaseScanner: RegionManager.metaScanner scanning meta region {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > 2009-01-18 14:20:04,389 INFO org.apache.hadoop.hbase.master.ServerManager: 10.1.0.56:60020 lease expired
> > 2009-01-18 14:20:04,411 INFO org.apache.hadoop.hbase.master.ServerManager: 10.1.0.52:60020 lease expired
> > 2009-01-18 14:20:04,411 INFO org.apache.hadoop.hbase.master.ServerManager: 10.1.0.50:60020 lease expired
> > 2009-01-18 14:20:05,402 INFO org.apache.hadoop.hbase.master.ServerManager: 10.1.0.60:60020 lease expired
> > 2009-01-18 14:20:06,101 WARN org.apache.hadoop.hbase.master.BaseScanner: Scan one META region: {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > java.io.IOException: java.io.IOException: HStoreScanner failed construction
> >        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:70)
> >        at org.apache.hadoop.hbase.regionserver.HStoreScanner.<init>(HStoreScanner.java:88)
> >        at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2125)
> >        at org.apache.hadoop.hbase.regionserver.HRegion$HScanner.<init>(HRegion.java:1989)
> >        at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1180)
> >        at org.apache.hadoop.hbase.regionserver.HRegionServer.openScanner(HRegionServer.java:1700)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:632)
> >        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:894)
> > Caused by: java.io.IOException: Could not obtain block: blk_-80256631922521533_11342 file=/hbase/.META./1028785192/info/mapfiles/722281705778744225/data
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1708)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1536)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1663)
> >        at java.io.DataInputStream.readFully(DataInputStream.java:178)
> >        at java.io.DataInputStream.readFully(DataInputStream.java:152)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.init(SequenceFile.java:1464)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1442)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1431)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1426)
> >        at org.apache.hadoop.hbase.io.MapFile$Reader.createDataFileReader(MapFile.java:310)
> >        at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.createDataFileReader(HBaseMapFile.java:96)
> >        at org.apache.hadoop.hbase.io.MapFile$Reader.open(MapFile.java:292)
> >        at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.<init>(HBaseMapFile.java:79)
> >        at org.apache.hadoop.hbase.io.BloomFilterMapFile$Reader.<init>(BloomFilterMapFile.java:65)
> >        at org.apache.hadoop.hbase.regionserver.HStoreFile.getReader(HStoreFile.java:443)
> >        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.openReaders(StoreFileScanner.java:96)
> >        at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:67)
> >        ... 11 more
> >
> >        at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
> >        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> >        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >        at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:95)
> >        at org.apache.hadoop.hbase.master.BaseScanner.scanRegion(BaseScanner.java:185)
> >        at org.apache.hadoop.hbase.master.MetaScanner.scanOneMetaRegion(MetaScanner.java:73)
> >        at org.apache.hadoop.hbase.master.MetaScanner.maintenanceScan(MetaScanner.java:129)
> >        at org.apache.hadoop.hbase.master.BaseScanner.chore(BaseScanner.java:137)
> >        at org.apache.hadoop.hbase.Chore.run(Chore.java:65)
> > 2009-01-18 14:20:06,106 INFO org.apache.hadoop.hbase.master.BaseScanner: All 1 .META. region(s) scanned
> > 2009-01-18 14:20:56,306 INFO org.apache.hadoop.hbase.master.BaseScanner: RegionManager.metaScanner scanning meta region {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > 2009-01-18 14:21:05,379 WARN org.apache.hadoop.hbase.master.BaseScanner: Scan one META region: {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > java.io.IOException: java.io.IOException: HStoreScanner failed construction
> >        [stack trace identical to the previous "Could not obtain block: blk_-80256631922521533_11342" occurrence above; elided]
> > 2009-01-18 14:21:05,381 INFO org.apache.hadoop.hbase.master.BaseScanner: All 1 .META. region(s) scanned
> > 2009-01-18 14:21:56,311 INFO org.apache.hadoop.hbase.master.BaseScanner: RegionManager.metaScanner scanning meta region {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > 2009-01-18 14:22:05,354 WARN org.apache.hadoop.hbase.master.BaseScanner: Scan one META region: {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > java.io.IOException: java.io.IOException: HStoreScanner failed construction
> >        [stack trace identical to the previous "Could not obtain block: blk_-80256631922521533_11342" occurrence above; elided]
> > 2009-01-18 14:22:05,356 INFO org.apache.hadoop.hbase.master.BaseScanner: All 1 .META. region(s) scanned
> > 2009-01-18 14:22:56,317 INFO org.apache.hadoop.hbase.master.BaseScanner: RegionManager.metaScanner scanning meta region {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > 2009-01-18 14:23:07,103 WARN org.apache.hadoop.hbase.master.BaseScanner: Scan one META region: {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > java.io.IOException: java.io.IOException: HStoreScanner failed construction
> >        [stack trace identical to the previous "Could not obtain block: blk_-80256631922521533_11342" occurrence above; elided]
> > 2009-01-18 14:23:07,106 INFO org.apache.hadoop.hbase.master.BaseScanner: All 1 .META. region(s) scanned
> > 2009-01-18 14:23:56,323 INFO org.apache.hadoop.hbase.master.BaseScanner: RegionManager.metaScanner scanning meta region {regionname: .META.,,1, startKey: <>, server: 10.1.0.56:60020}
> > 2009-01-18 14:24:03,923 WARN org.apache.hadoop.hbase.RegionHistorian: Unable to 'Region opened on server : 3c.c3.33.static.xlhost.com'
> > org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact region server 10.1.0.56:60020 for region .META.,,1, row 'step0,f50eba0233a6f9c4cc369d68ca63b2a4,1232286170694', but failed after 10 attempts.
> > Exceptions:
> > java.io.IOException: java.io.IOException: Could not read from stream
> >        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:119)
> >        at java.io.DataInputStream.readByte(DataInputStream.java:248)
> >        at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:325)
> >        at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:346)
> >        at org.apache.hadoop.io.Text.readString(Text.java:400)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2779)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2704)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:1997)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2183)
> >
> > java.io.IOException: java.io.IOException: Could not obtain block: blk_2410490605807408170_12109 file=/hbase/-ROOT-/70236052/info/mapfiles/4049664282920944880/data
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1708)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1536)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1663)
> >        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1593)
> >        at java.io.DataInputStream.readInt(DataInputStream.java:370)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.readRecordLength(SequenceFile.java:1909)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.next(SequenceFile.java:1939)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.next(SequenceFile.java:1844)
> >        at org.apache.hadoop.hbase.io.SequenceFile$Reader.next(SequenceFile.java:1890)
> >        at org.apache.hadoop.hbase.io.MapFile$Reader.next(MapFile.java:525)
> >        at org.apache.hadoop.hbase.regionserver.HStore.rowAtOrBeforeFromMapFile(HStore.java:1714)
> >        at org.apache.hadoop.hbase.regionserver.HStore.getRowKeyAtOrBefore(HStore.java:1686)
> >        at org.apache.hadoop.hbase.regionserver.HRegion.getClosestRowBefore(HRegion.java:1088)
> >        at org.apache.hadoop.hbase.regionserver.HRegionServer.getClosestRowBefore(HRegionServer.java:1548)
> >        at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:632)
> >        at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:894)
> >
> > org.apache.hadoop.hbase.client.NoServerForRegionException: Timed out trying to locate root region
> >
>
>
