hbase-user mailing list archives

From divye sheth <divs.sh...@gmail.com>
Subject Re: RegionServer Shutdown
Date Thu, 09 Jul 2015 08:22:51 GMT
Hi,

No errors reported. I want to bring to your notice that this started after
I replaced the hadoop 2.2.0 jars in the hbase lib with the hadoop 2.6.0
jars.

Thanks!

On Thu, Jul 9, 2015 at 1:35 PM, Dejan Menges <dejan.menges@gmail.com> wrote:

> Hi,
>
> Can you add -x to your start-hbase.sh and try to run it again? Maybe it
> will tell you something more, e.g. about a missing path/folder.
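For readers of the archive: a minimal demonstration of what `-x` does (each command is echoed to stderr with a `+` prefix before it runs), using a throwaway script rather than start-hbase.sh itself:

```shell
# A tiny stand-in script; against the real thing you would run
# `bash -x $HBASE_HOME/bin/start-hbase.sh` instead.
cat > /tmp/trace-demo.sh <<'EOF'
HBASE_PID_DIR=/tmp/hbase-pids
echo "pids go to $HBASE_PID_DIR"
EOF
# -x prints each command, with variables expanded, before executing it.
bash -x /tmp/trace-demo.sh 2>&1
```

The trace lines show variable assignments as they actually happen, which is what makes `-x` useful for spotting a wrong path or an env variable that is empty over ssh.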
>
> On Thu, Jul 9, 2015 at 8:09 AM divye sheth <divs.sheth@gmail.com> wrote:
>
> > Hi Samir,
> >
> > While debugging I found the following. When I extract the hbase tar
> > and run it after making only configuration changes, the script works
> > fine, i.e. start-hbase.sh starts the regionserver properly.
> >
> > Now, since I am using hadoop-2.6.0, I replaced all the hadoop-related
> > jars in $HBASE_HOME/lib with the 2.6.0 versions (hadoop jars 2.2.0 ->
> > 2.6.0). This is the point after which the start-hbase.sh script fails.
> > The htrace jars are available in both the hadoop and hbase classpaths.
> > I wonder what is wrong.
> >
> > Thanks!
> >
> > On Tue, Jul 7, 2015 at 6:32 PM, Samir Ahmic <ahmic.samir@gmail.com>
> wrote:
> >
> > > OK, then it is something related to how classes are loaded when you
> > > start hbase with the start-hbase.sh script. start-hbase.sh uses ssh
> > > to start the hbase daemons if you are running hbase in distributed
> > > mode (which is also the case in pseudo-distributed mode), so I
> > > suggest you check your ssh config and see which env variables are
> > > loaded when you make an ssh connection. Here is the doc on hbase
> > > pseudo-distributed mode:
> > > http://hbase.apache.org/book.html#quickstart_pseudo
> > >
> > > Best Regards
> > > Samir
> > >
> > > On Tue, Jul 7, 2015 at 2:33 PM, divye sheth <divs.sheth@gmail.com>
> > wrote:
> > >
> > > > Hi Samir,
> > > >
> > > > The output of the hadoop classpath command lists the directory
> > > > $HADOOP_PREFIX/share/hadoop/common/lib/*; inside this location
> > > > resides the htrace-core-3.0.4.jar file.
> > > >
> > > > Could it be a version issue, since hbase comes with
> > > > htrace-core-2.04.jar? And as I said, the regionserver starts fine
> > > > if started with the hbase-daemon.sh start regionserver command.
> > > >
> > > > Thanks!
> > > >
> > > > On Tue, Jul 7, 2015 at 5:19 PM, Samir Ahmic <ahmic.samir@gmail.com>
> > > wrote:
> > > >
> > > > > Hi,
> > > > > It looks like you are missing the htrace jar in your hadoop
> > > > > classpath. You can check it with:
> > > > >  $ hadoop classpath | tr ":" "\n" | grep htrace
> > > > > If it is not in the classpath you will need to include it in the
> > > > > hadoop classpath. The HTrace jar is located in $HBASE_HOME/lib.
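The pipeline above just splits the colon-separated classpath onto one entry per line and filters for htrace; it can be sanity-checked against any classpath string (the paths below are made-up examples, not output from a real installation):

```shell
# Stand-in for the output of `hadoop classpath` (hypothetical paths).
CP="/opt/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar:/opt/hadoop/share/hadoop/common/lib/guava-11.0.2.jar"
# Split on ':' so each entry sits on its own line, then keep htrace entries.
echo "$CP" | tr ":" "\n" | grep htrace
# -> /opt/hadoop/share/hadoop/common/lib/htrace-core-3.0.4.jar
```

If the real `hadoop classpath` output produces no line here, the jar is not visible to Hadoop at all, regardless of what sits in $HBASE_HOME/lib.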
> > > > >
> > > > > Regards
> > > > > Samir
> > > > >
> > > > > On Tue, Jul 7, 2015 at 1:15 PM, divye sheth <divs.sheth@gmail.com>
> > > > wrote:
> > > > >
> > > > > > Hi,
> > > > > >
> > > > > > I have installed Hbase-0.98 over Hadoop 2.6.0 in
> > > > > > pseudo-distributed mode with zookeeper managed separately.
> > > > > > Everything works fine and I am even able to access the hbase
> > > > > > cluster without any issues when started using the
> > > > > > hbase-daemon.sh script.
> > > > > >
> > > > > > The problem I am facing is that the regionserver immediately
> > > > > > shuts down on startup when the cluster is started using the
> > > > > > start-hbase.sh script.
> > > > > >
> > > > > > I tried searching for the root cause/remedy online but to no
> > > > > > avail. Please find below the error log trace.
> > > > > >
> > > > > >
> > > > > > 2015-07-07 16:12:23,872 FATAL [regionserver60020] regionserver.HRegionServer: ABORTING region server fal-tb-01.fractal.com,60020,1436265741575: Unhandled: Region server startup failed
> > > > > > java.io.IOException: Region server startup failed
> > > > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.convertThrowableToIOE(HRegionServer.java:2869)
> > > > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:1356)
> > > > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:899)
> > > > > >         at java.lang.Thread.run(Thread.java:745)
> > > > > > Caused by: java.lang.NoClassDefFoundError: org/htrace/Trace
> > > > > >         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:214)
> > > > > >         at com.sun.proxy.$Proxy19.getFileInfo(Unknown Source)
> > > > > >         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
> > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > > > >         at java.lang.reflect.Method.invoke(Method.java:606)
> > > > > >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
> > > > > >         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> > > > > >         at com.sun.proxy.$Proxy20.getFileInfo(Unknown Source)
> > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > > > >         at java.lang.reflect.Method.invoke(Method.java:606)
> > > > > >         at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:294)
> > > > > >         at com.sun.proxy.$Proxy21.getFileInfo(Unknown Source)
> > > > > >         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
> > > > > >         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
> > > > > >         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
> > > > > >         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> > > > > >         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
> > > > > >         at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:409)
> > > > > >         at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
> > > > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.setupWALAndReplication(HRegionServer.java:1587)
> > > > > >         at org.apache.hadoop.hbase.regionserver.HRegionServer.handleReportForDutyResponse(HRegionServer.java:1340)
> > > > > >         ... 2 more
> > > > > > Caused by: java.lang.ClassNotFoundException: org.htrace.Trace
> > > > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> > > > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> > > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> > > > > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> > > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> > > > > >         ... 27 more
> > > > > > I verified that the htrace-core.jar is available in the
> > > > > > classpath. Any pointers would be highly appreciated.
> > > > > >
> > > > > > Thanks
> > > > > > Divye Sheth
> > > > > >
> > > > >
> > > >
> > >
> >
>
