hbase-user mailing list archives

From Misty Stanley-Jones <mstanleyjo...@cloudera.com>
Subject Re: Error while building HBase from source code
Date Wed, 03 Sep 2014 23:26:55 GMT
I think somehow you must be looking at a different hbase-site.xml file. Can
you please create a new user on your system and run through the whole
Quickstart chapter (http://hbase.apache.org/book.html#quickstart) with
everything fresh, and let us know what happens? You certainly shouldn't
have any blockcache configuration in a default configuration.


On Tue, Sep 2, 2014 at 5:12 AM, @Sanjiv Singh <sanjiv.is.on@gmail.com>
wrote:

> I checked both hbase-default.xml and hbase-site.xml in my code base.
>
> Nothing is defined for this property in hbase-site.xml, and there is no
> value for "hbase.bucketcache.ioengine" in hbase-default.xml. I am not sure
> what's wrong.
>
> See the default values here:
>
> https://github.com/apache/hbase/blob/master/hbase-common/src/main/resources/hbase-default.xml
>
> When I changed it to 'heap' in my hbase-site.xml, I started getting a
> different error:
>
> Exception in thread "main" java.lang.RuntimeException: Current heap
> configuration for MemStore and BlockCache exceeds the threshold required
> for successful cluster operation. The combined value cannot exceed 0.8.
> Please check the settings for hbase.regionserver.global.memstore.size and
> hfile.block.cache.size in your configuration.
> hbase.regionserver.global.memstore.size is 0.4 hfile.block.cache.size is
> 68.09002
>  at org.apache.hadoop.hbase.io.util.HeapMemorySizeUtil.checkForClusterFreeMemoryLimit(HeapMemorySizeUtil.java:64)
>  at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:82)
>  at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:92)
>  at org.apache.hadoop.hbase.util.HBaseConfTool.main(HBaseConfTool.java:39)
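The check that throws here is just a sum-of-fractions sanity test. A minimal shell sketch of the same arithmetic (illustrative values, not HBase's actual code):

```shell
# Sketch of the sanity check behind the error above (not HBase's code):
# hbase.regionserver.global.memstore.size + hfile.block.cache.size
# must not exceed 0.8. Both are fractions of the heap, so a value like
# 68.09002 for hfile.block.cache.size fails immediately.
MEMSTORE=0.4
BLOCKCACHE=0.4
awk -v m="$MEMSTORE" -v b="$BLOCKCACHE" 'BEGIN {
  if (m + b > 0.8) { print "error: combined value exceeds 0.8"; exit 1 }
  printf "ok: combined = %.1f\n", m + b
}'
```

With the defaults of 0.4 each the check passes exactly at the 0.8 limit, which is why the error above points at a misconfigured hfile.block.cache.size rather than the memstore fraction.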
>
>
>
>
>
> Regards
> Sanjiv Singh
> Mob :  +091 9990-447-339
>
>
> On Tue, Sep 2, 2014 at 5:09 PM, tobe <tobeg3oogle@gmail.com> wrote:
>
> > By default, HBase uses the on-heap LruBlockCache, which does not require
> > hbase.bucketcache.ioengine to be configured. Please double-check your
> > configuration file, hbase-site.xml; it is fine for it to have no
> > properties at all.
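One quick way to verify this is to grep the config for bucketcache settings; the file path and contents below are illustrative (a fresh, empty hbase-site.xml):

```shell
# Illustrative check: a default hbase-site.xml (an empty <configuration>
# element) should contain no bucketcache settings at all, so HBase falls
# back to the on-heap LruBlockCache.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
</configuration>
EOF
if grep -q "hbase.bucketcache" "$CONF"; then
  echo "bucketcache is configured"
else
  echo "no bucketcache settings; on-heap LruBlockCache is used"
fi
```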
> >
> >
> > On Tue, Sep 2, 2014 at 6:29 PM, @Sanjiv Singh <sanjiv.is.on@gmail.com>
> > wrote:
> >
> >> Looking at the source code, it seems "hbase.bucketcache.ioengine" must
> >> either start with "file:" or be set to "offheap" or "heap".
> >>
> >> That might help in debugging the issue.
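Judging by the error string ("prefix with file:, heap or offheap"), the accepted values can be sketched like this; a shell approximation, not the actual BucketCache.getIOEngineFromName code:

```shell
# Approximation of the accepted io engine names, based on the error
# message "prefix with file:, heap or offheap" (not HBase's code).
validate_ioengine() {
  case "$1" in
    file:*)  echo "valid: file-backed cache ($1)" ;;
    offheap) echo "valid: offheap cache" ;;
    heap)    echo "valid: on-heap cache" ;;
    *)       echo "invalid io engine name: $1" ;;
  esac
}
validate_ioengine heap
validate_ioengine file:/tmp/bucketcache.data
validate_ioengine lru
```

Any other value, including an empty string, falls through to the invalid branch, which matches the IllegalArgumentException seen in the logs.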
> >>
> >>
> >> Regards
> >> Sanjiv Singh
> >> Mob :  +091 9990-447-339
> >>
> >>
> >> On Tue, Sep 2, 2014 at 3:45 PM, @Sanjiv Singh <sanjiv.is.on@gmail.com>
> >> wrote:
> >>
> >>> Hi Tobe,
> >>>
> >>> I have removed HADOOP_HOME and HBASE_HOME; now I am getting a
> >>> different error:
> >>>
> >>> Caused by: java.lang.IllegalArgumentException: Don't understand io
> >>> engine name for cache - prefix with file:, heap or offheap
> >>>  at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getIOEngineFromName(BucketCache.java:302)
> >>>
> >>> See below for complete startup logs :
> >>>
> >>> 2014-09-02 15:38:10,084 INFO  [main] util.VersionInfo: Compiled by
> >>> impadmin on Tue Sep  2 15:03:37 IST 2014
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09
> GMT
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:host.name=IMPETUS-NL147centos.impetus.co.in
> >>>  2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:java.version=1.7.0
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:java.vendor=Oracle Corporation
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:java.home=/usr/java/jdk1.7.0/jre
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>>
> environment:java.class.path=/usr/local/hbase/hbase-2.0.0-SNAPSHOT/bin/../conf:/usr/java/jdk1.7.0/lib/tools.jar:/usr/local/hbase/hbase-2.0.0-SNAPSHOT/bin/..:/usr/local/hbase/hbase-2.0.0-
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>>
> environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:java.io.tmpdir=/tmp
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:java.compiler=<NA>
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:os.name=Linux
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:os.arch=amd64
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:os.version=2.6.32-358.el6.x86_64
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:user.name=impadmin
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:user.home=/home/impadmin
> >>> 2014-09-02 15:38:10,626 INFO  [main] server.ZooKeeperServer: Server
> >>> environment:user.dir=/usr/local/hbase/hbase-2.0.0-SNAPSHOT
> >>> 2014-09-02 15:38:10,657 INFO  [main] server.ZooKeeperServer: Created
> >>> server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout
> 40000
> >>> datadir /tmp/hbase-impadmin/zookeeper/zookeeper_0/version-2 snapdir
> >>> /tmp/hbase-impadmin/zookeeper/zookeeper_0/version-2
> >>> 2014-09-02 15:38:10,704 INFO  [main] server.NIOServerCnxnFactory:
> >>> binding to port 0.0.0.0/0.0.0.0:2181
> >>> 2014-09-02 15:38:11,291 INFO  [NIOServerCxn.Factory:
> 0.0.0.0/0.0.0.0:2181]
> >>> server.NIOServerCnxnFactory: Accepted socket connection from /
> >>> 127.0.0.1:49144
> >>> 2014-09-02 15:38:11,296 INFO  [NIOServerCxn.Factory:
> 0.0.0.0/0.0.0.0:2181]
> >>> server.NIOServerCnxn: Processing stat command from /127.0.0.1:49144
> >>> 2014-09-02 15:38:11,301 INFO  [Thread-2] server.NIOServerCnxn: Stat
> >>> command output
> >>> 2014-09-02 15:38:11,302 INFO  [Thread-2] server.NIOServerCnxn: Closed
> >>> socket connection for client /127.0.0.1:49144 (no session established
> >>> for client)
> >>> 2014-09-02 15:38:11,302 INFO  [main] zookeeper.MiniZooKeeperCluster:
> >>> Started MiniZK Cluster and connect 1 ZK server on client port: 2181
> >>> 2014-09-02 15:38:12,066 INFO  [main] regionserver.RSRpcServices:
> master/
> >>> IMPETUS-NL147centos.impetus.co.in/127.0.1.1:0 server-side HConnection
> >>> retries=350
> >>> 2014-09-02 15:38:12,517 INFO  [main] ipc.SimpleRpcScheduler: Using
> >>> deadline as user call queue, count=3
> >>> 2014-09-02 15:38:12,543 INFO  [main] ipc.RpcServer: master/
> >>> IMPETUS-NL147centos.impetus.co.in/127.0.1.1:0: started 10 reader(s).
> >>> 2014-09-02 15:38:12,646 INFO  [main] impl.MetricsConfig: loaded
> >>> properties from hadoop-metrics2-hbase.properties
> >>> 2014-09-02 15:38:12,710 INFO  [main] impl.MetricsSystemImpl: Scheduled
> >>> snapshot period at 10 second(s).
> >>> 2014-09-02 15:38:12,710 INFO  [main] impl.MetricsSystemImpl: HBase
> >>> metrics system started
> >>> 2014-09-02 15:38:12,887 WARN  [main] util.NativeCodeLoader: Unable to
> >>> load native-hadoop library for your platform... using builtin-java
> classes
> >>> where applicable
> >>> 2014-09-02 15:38:12,915 INFO  [main] hfile.CacheConfig: Allocating
> >>> LruBlockCache size=386.70 MB, blockSize=64 KB
> >>> 2014-09-02 15:38:12,949 ERROR [main] master.HMasterCommandLine: Master
> >>> exiting
> >>> java.lang.RuntimeException: Failed construction of Master: class
> >>> org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster
> >>>  at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:145)
> >>>  at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:214)
> >>>  at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:152)
> >>>  at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:185)
> >>>  at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:139)
> >>>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >>>  at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
> >>>  at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1764)
> >>> Caused by: java.lang.IllegalArgumentException: Don't understand io engine name for cache - prefix with file:, heap or offheap
> >>>  at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.getIOEngineFromName(BucketCache.java:302)
> >>>  at org.apache.hadoop.hbase.io.hfile.bucket.BucketCache.<init>(BucketCache.java:218)
> >>>  at org.apache.hadoop.hbase.io.hfile.CacheConfig.getL2(CacheConfig.java:496)
> >>>  at org.apache.hadoop.hbase.io.hfile.CacheConfig.instantiateBlockCache(CacheConfig.java:517)
> >>>  at org.apache.hadoop.hbase.io.hfile.CacheConfig.<init>(CacheConfig.java:206)
> >>>  at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:478)
> >>>  at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:264)
> >>>  at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:266)
> >>>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >>>  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> >>>  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >>>  at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
> >>>  at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:141)
> >>>  ... 7 more
> >>>
> >>> Regards
> >>> Sanjiv Singh
> >>> Mob :  +091 9990-447-339
> >>>
> >>>
> >>> On Tue, Sep 2, 2014 at 3:30 PM, tobe <tobeg3oogle@gmail.com> wrote:
> >>>
> >>>> I think you should NOT set HADOOP_HOME and HBASE_HOME; they have
> >>>> caused some trouble for me.
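In shell terms, that advice amounts to clearing both variables in the session that will launch HBase:

```shell
# Clear inherited Hadoop/HBase locations for this shell session, so
# bin/start-hbase.sh resolves everything from its own directory
# instead of a stale Hadoop 1.x install.
unset HADOOP_HOME HBASE_HOME
echo "HADOOP_HOME=${HADOOP_HOME:-<unset>}"
echo "HBASE_HOME=${HBASE_HOME:-<unset>}"
```

Note this only affects the current shell; if the variables are exported from a profile script (~/.bashrc or similar), they should be removed there as well.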
> >>>>
> >>>>
> >>>> On Tue, Sep 2, 2014 at 5:48 PM, @Sanjiv Singh <sanjiv.is.on@gmail.com
> >
> >>>> wrote:
> >>>>
> >>>>> I just checked, following are defined for these.
> >>>>>
> >>>>> # pointing to hadoop 1.x
> >>>>> HADOOP_HOME=/usr/local/hadoop/hadoop-1.2.1/
> >>>>>
> >>>>> # directory which is created after
> >>>>> extracting hbase-2.0.0-SNAPSHOT-bin.tar.gz
> >>>>>
> >>>>> HBASE_HOME=/usr/local/hbase/hbase-2.0.0-SNAPSHOT/
> >>>>>
> >>>>> Regards
> >>>>> Sanjiv Singh
> >>>>> Mob :  +091 9990-447-339
> >>>>>
> >>>>>
> >>>>> On Tue, Sep 2, 2014 at 2:35 PM, tobe <tobeg3oogle@gmail.com> wrote:
> >>>>>
> >>>>>> The default configuration should work well. Check if you set
> >>>>>> $HADOOP_HOME and $HBASE_HOME or not.
> >>>>>>
> >>>>>>
> >>>>>> On Tue, Sep 2, 2014 at 5:00 PM, @Sanjiv Singh <
> sanjiv.is.on@gmail.com
> >>>>>> > wrote:
> >>>>>>
> >>>>>>> Hi Dima,
> >>>>>>>
> >>>>>>> It's standalone mode, where all daemons run in one JVM. I have not
> >>>>>>> changed a single configuration value; I tried to start HBase with
> >>>>>>> all default configuration.
> >>>>>>>
> >>>>>>> Let me know if you need more info to debug.
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>> Regards
> >>>>>>> Sanjiv Singh
> >>>>>>> Mob :  +091 9990-447-339
> >>>>>>>
> >>>>>>>
> >>>>>>> On Tue, Sep 2, 2014 at 1:21 PM, Dima Spivak <dspivak@cloudera.com>
> >>>>>>> wrote:
> >>>>>>>
> >>>>>>> > Hi Sanjiv,
> >>>>>>> >
> >>>>>>> > Are you looking to run HBase in standalone mode (all daemons in
> >>>>>>> one JVM)
> >>>>>>> > or pseudo-distributed mode (each with its own process, but on one
> >>>>>>> host)?
> >>>>>>> > Have you tried following the instructions on
> >>>>>>> > http://hbase.apache.org/book.html regarding configurations to
> >>>>>>> switch
> >>>>>>> > between the two?
> >>>>>>> >
> >>>>>>> > All the best,
> >>>>>>> >    Dima
> >>>>>>> >
> >>>>>>> >
> >>>>>>> > On Mon, Sep 1, 2014 at 10:53 PM, @Sanjiv Singh <
> >>>>>>> sanjiv.is.on@gmail.com>
> >>>>>>> > wrote:
> >>>>>>> >
> >>>>>>> >> Any help on this? The issue is still not resolved.
> >>>>>>> >>
> >>>>>>> >> Regards
> >>>>>>> >> Sanjiv Singh
> >>>>>>> >> Mob :  +091 9990-447-339
> >>>>>>> >>
> >>>>>>> >>
> >>>>>>> >> On Mon, Sep 1, 2014 at 5:31 PM, @Sanjiv Singh <
> >>>>>>> sanjiv.is.on@gmail.com>
> >>>>>>> >> wrote:
> >>>>>>> >>
> >>>>>>> >> > Hi Matteo,
> >>>>>>> >> >
> >>>>>>> >> > Following your steps, I was able to build the binary package
> >>>>>>> >> > hbase-2.0.0-SNAPSHOT-bin.tar.gz. Today I used it to run HBase
> >>>>>>> >> > locally, but it gives an error when I try to start it with
> >>>>>>> >> > start-hbase.sh.
> >>>>>>> >> >
> >>>>>>> >> > I am totally lost, as I don't have a clue about the error.
> >>>>>>> >> > Please help me with this.
> >>>>>>> >> >
> >>>>>>> >> > Following are the logs generated in the logs directory:
> >>>>>>> >> >
> >>>>>>> >> > 2014-09-01 17:22:46,096 INFO  [main] server.ZooKeeperServer:
> >>>>>>> Server
> >>>>>>> >> > environment:user.name=impadmin
> >>>>>>> >> > 2014-09-01 17:22:46,096 INFO  [main] server.ZooKeeperServer:
> >>>>>>> Server
> >>>>>>> >> > environment:user.home=/home/impadmin
> >>>>>>> >> > 2014-09-01 17:22:46,096 INFO  [main] server.ZooKeeperServer:
> >>>>>>> Server
> >>>>>>> >> > environment:user.dir=/usr/local/hbase/hbase-2.0.0-SNAPSHOT
> >>>>>>> >> > 2014-09-01 17:22:46,109 INFO  [main] server.ZooKeeperServer:
> >>>>>>> Created
> >>>>>>> >> > server with tickTime 2000 minSessionTimeout 4000
> >>>>>>> maxSessionTimeout 40000
> >>>>>>> >> > datadir /tmp/hbase-impad$
> >>>>>>> >> > 2014-09-01 17:22:46,121 INFO  [main]
> >>>>>>> server.NIOServerCnxnFactory:
> >>>>>>> >> binding
> >>>>>>> >> > to port 0.0.0.0/0.0.0.0:2181
> >>>>>>> >> > 2014-09-01 17:22:46,660 INFO  [NIOServerCxn.Factory:
> >>>>>>> >> 0.0.0.0/0.0.0.0:2181]
> >>>>>>> >> > server.NIOServerCnxnFactory: Accepted socket connection from /
> >>>>>>> >> > 127.0.0.1:34636
> >>>>>>> >> > 2014-09-01 17:22:46,667 INFO  [NIOServerCxn.Factory:
> >>>>>>> >> 0.0.0.0/0.0.0.0:2181]
> >>>>>>> >> > server.NIOServerCnxn: Processing stat command from /
> >>>>>>> 127.0.0.1:34636
> >>>>>>> >> > 2014-09-01 17:22:46,672 INFO  [Thread-2] server.NIOServerCnxn:
> >>>>>>> Stat
> >>>>>>> >> > command output
> >>>>>>> >> > 2014-09-01 17:22:46,673 INFO  [Thread-2] server.NIOServerCnxn:
> >>>>>>> Closed
> >>>>>>> >> > socket connection for client /127.0.0.1:34636 (no session
> >>>>>>> established
> >>>>>>> >> for
> >>>>>>> >> > client)
> >>>>>>> >> > 2014-09-01 17:22:46,673 INFO  [main]
> >>>>>>> zookeeper.MiniZooKeeperCluster:
> >>>>>>> >> > Started MiniZK Cluster and connect 1 ZK server on client port:
> >>>>>>> 2181
> >>>>>>> >> > 2014-09-01 17:22:47,067 INFO  [main]
> >>>>>>> regionserver.RSRpcServices: master/
> >>>>>>> >> > IMPETUS-NL147centos.impetus.co.in/127.0.1.1:0 server-side
> >>>>>>> HConnection
> >>>>>>> >> > retries=350
> >>>>>>> >> > 2014-09-01 17:22:47,221 INFO  [main] ipc.SimpleRpcScheduler:
> >>>>>>> Using
> >>>>>>> >> > deadline as user call queue, count=3
> >>>>>>> >> > 2014-09-01 17:22:47,234 INFO  [main] ipc.RpcServer: master/
> >>>>>>> >> > IMPETUS-NL147centos.impetus.co.in/127.0.1.1:0: started 10
> >>>>>>> reader(s).
> >>>>>>> >> > 2014-09-01 17:22:47,297 INFO  [main] impl.MetricsConfig:
> loaded
> >>>>>>> >> properties
> >>>>>>> >> > from hadoop-metrics2-hbase.properties
> >>>>>>> >> > 2014-09-01 17:22:47,324 INFO  [main] impl.MetricsSystemImpl:
> >>>>>>> Scheduled
> >>>>>>> >> > snapshot period at 10 second(s).
> >>>>>>> >> > 2014-09-01 17:22:47,324 INFO  [main] impl.MetricsSystemImpl:
> >>>>>>> HBase
> >>>>>>> >> metrics
> >>>>>>> >> > system started
> >>>>>>> >> > 2014-09-01 17:22:47,428 ERROR [main]
> master.HMasterCommandLine:
> >>>>>>> Master
> >>>>>>> >> > exiting
> >>>>>>> >> > java.lang.RuntimeException: Failed construction of Master:
> class
> >>>>>>> >> >
> >>>>>>> org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMasternull
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:145)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:214)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:152)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:185)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:139)
> >>>>>>> >> >         at
> >>>>>>> org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
> >>>>>>> >> >         at
> >>>>>>> >> org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1764)
> >>>>>>> >> > Caused by: java.lang.RuntimeException:
> >>>>>>> >> > java.lang.reflect.InvocationTargetException
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
> >>>>>>> >> >         at
> >>>>>>> org.apache.hadoop.security.Groups.<init>(Groups.java:64)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:309)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:303)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.security.User$SecureHadoopUser.isSecurityEnabled(User.java:349)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.security.User$SecureHadoopUser.login(User.java:340)
> >>>>>>> >> >         at
> >>>>>>> org.apache.hadoop.hbase.security.User.login(User.java:208)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.security.UserProvider.login(UserProvider.java:116)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.regionserver.HRegionServer.login(HRegionServer.java:526)
> >>>>>>> >> >         at
> >>>>>>> >> org.apache.hadoop.hbase.master.HMaster.login(HMaster.java:338)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:475)
> >>>>>>> >> >         at
> >>>>>>> >> org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:264)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:266)
> >>>>>>> >> >         at
> >>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> >>>>>>> >> > Method)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >>>>>>> >> >         at
> >>>>>>> >> java.lang.reflect.Constructor.newInstance(Constructor.java:525)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:141)
> >>>>>>> >> > Caused by: java.lang.reflect.InvocationTargetException
> >>>>>>> >> >         at
> >>>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> >>>>>>> >> > Method)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >>>>>>> >> >         at
> >>>>>>> >> java.lang.reflect.Constructor.newInstance(Constructor.java:525)
> >>>>>>> >> >         at
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>>
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
> >>>>>>> >> >         ... 27 more
> >>>>>>> >> > Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
> >>>>>>> >> >         at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
> >>>>>>> >> >         at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
> >>>>>>> >> >         at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:38)
> >>>>>>> >> >         ... 32 more
> >>>>>>> >> >
> >>>>>>> >> >
> >>>>>>> >> >
> >>>>>>> >> > Regards
> >>>>>>> >> > Sanjiv Singh
> >>>>>>> >> > Mob :  +091 9990-447-339
> >>>>>>> >> >
> >>>>>>> >> >
> >>>>>>> >> > On Fri, Aug 22, 2014 at 9:18 PM, @Sanjiv Singh <
> >>>>>>> sanjiv.is.on@gmail.com>
> >>>>>>> >> > wrote:
> >>>>>>> >> >
> >>>>>>> >> >> Hi Matteo,
> >>>>>>> >> >>
> >>>>>>> >> >> I cleaned everything up, started from scratch, and followed all
> >>>>>>> >> >> the steps you provided ...it worked this time without any issue.
> >>>>>>> >> >>
> >>>>>>> >> >> I really don't know what the issue was. Thank you very much for
> >>>>>>> >> >> the help; I will move forward with this.
> >>>>>>> >> >>
> >>>>>>> >> >> Regards
> >>>>>>> >> >> Sanjiv Singh
> >>>>>>> >> >> Mob :  +091 9990-447-339
> >>>>>>> >> >>
> >>>>>>> >> >>
> >>>>>>> >> >> On Fri, Aug 22, 2014 at 8:51 PM, @Sanjiv Singh <
> >>>>>>> sanjiv.is.on@gmail.com
> >>>>>>> >> >
> >>>>>>> >> >> wrote:
> >>>>>>> >> >>
> >>>>>>> >> >>> my maven  :
> >>>>>>> >> >>>
> >>>>>>> >> >>> $ mvn --version
> >>>>>>> >> >>>
> >>>>>>> >> >>> Apache Maven 3.2.1
> (ea8b2b07643dbb1b84b6d16e1f08391b666bc1e9;
> >>>>>>> >> >>> 2014-02-14T23:07:52+05:30)
> >>>>>>> >> >>> Maven home: /usr/local/apache-maven/apache-maven-3.2.1
> >>>>>>> >> >>> Java version: 1.7.0, vendor: Oracle Corporation
> >>>>>>> >> >>> Java home: /usr/java/jdk1.7.0/jre
> >>>>>>> >> >>> Default locale: en_US, platform encoding: UTF-8
> >>>>>>> >> >>> OS name: "linux", version: "2.6.32-358.el6.x86_64", arch:
> >>>>>>> "amd64",
> >>>>>>> >> >>> family: "unix"
> >>>>>>> >> >>>
> >>>>>>> >> >>>
> >>>>>>> >> >>> Regards
> >>>>>>> >> >>> Sanjiv Singh
> >>>>>>> >> >>> Mob :  +091 9990-447-339
> >>>>>>> >> >>>
> >>>>>>> >> >>>
> >>>>>>> >> >>> On Fri, Aug 22, 2014 at 8:13 PM, Ted Yu <
> yuzhihong@gmail.com>
> >>>>>>> wrote:
> >>>>>>> >> >>>
> >>>>>>> >> >>>> jamon-maven-plugin is used to generate the XXTmpl.java
> files
> >>>>>>> >> >>>>
> >>>>>>> >> >>>> In a successful build, you would see:
> >>>>>>> >> >>>>
> >>>>>>> >> >>>> [INFO] Source directory:
> >>>>>>> >> >>>> /Users/tyu/trunk/hbase-server/target/generated-jamon added.
> >>>>>>> >> >>>> [INFO] Source directory:
> >>>>>>> >> >>>> /Users/tyu/trunk/hbase-server/target/generated-sources/java
> >>>>>>> added.
> >>>>>>> >> >>>> [INFO]
> >>>>>>> >> >>>> [INFO] --- jamon-maven-plugin:2.3.4:translate (default) @
> >>>>>>> >> hbase-server
> >>>>>>> >> >>>> ---
> >>>>>>> >> >>>> [INFO] Translating 10 templates from
> >>>>>>> >> >>>> /Users/tyu/trunk/hbase-server/src/main/jamon to
> >>>>>>> >> >>>> /Users/tyu/trunk/hbase-server/target/generated-jamon
> >>>>>>> >> >>>> [INFO]
> >>>>>>> >> >>>>
> >>>>>>> >> >>>> What maven version are you using ?
> >>>>>>> >> >>>> Here is the version I use:
> >>>>>>> >> >>>>
> >>>>>>> >> >>>> $ mvn --version
> >>>>>>> >> >>>> Apache Maven 3.0.5
> >>>>>>> (r01de14724cdef164cd33c7c8c2fe155faf9602da;
> >>>>>>> >> >>>> 2013-02-19 05:51:28-0800)
> >>>>>>> >> >>>> Maven home: /Users/tyu/apache-maven-3.0.5
> >>>>>>> >> >>>>
> >>>>>>> >> >>>> Cheers
> >>>>>>> >> >>>>
> >>>>>>> >> >>>>
> >>>>>>> >> >>>> On Fri, Aug 22, 2014 at 1:57 AM, @Sanjiv Singh <
> >>>>>>> >> sanjiv.is.on@gmail.com>
> >>>>>>> >> >>>> wrote:
> >>>>>>> >> >>>>
> >>>>>>> >> >>>>> Thanks for the quick response.
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>> Please find attached "compile.log" for logs of command
> "mvn
> >>>>>>> clean
> >>>>>>> >> >>>>> package -DskipTests".
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>> It clearly says "Building HBase 2.0.0-SNAPSHOT".
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>> Let me know if I am wrong.
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>> Regards
> >>>>>>> >> >>>>> Sanjiv Singh
> >>>>>>> >> >>>>> Mob :  +091 9990-447-339
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>> On Fri, Aug 22, 2014 at 2:20 PM, tobe <
> >>>>>>> tobeg3oogle@gmail.com>
> >>>>>>> >> wrote:
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>>> 2.0.0-SNAPSHOT should be the version of Hadoop, not
> HBase.
> >>>>>>> >> >>>>>>
> >>>>>>> >> >>>>>> Refer to the official guide
> >>>>>>> >> http://hbase.apache.org/book/build.html,
> >>>>>>> >> >>>>>> you should run `mvn clean package -DskipTests` to
> compile.
> >>>>>>> >> >>>>>>
> >>>>>>> >> >>>>>>
> >>>>>>> >> >>>>>> On Fri, Aug 22, 2014 at 4:41 PM, @Sanjiv Singh <
> >>>>>>> >> >>>>>> sanjiv.is.on@gmail.com> wrote:
> >>>>>>> >> >>>>>>
> >>>>>>> >> >>>>>>> HI,
> >>>>>>> >> >>>>>>> Here are details :
> >>>>>>> >> >>>>>>> HBase - 2.0.0-SNAPSHOT (current hbase-master)
> >>>>>>> >> >>>>>>> java version "1.7.0"
> >>>>>>> >> >>>>>>>
> >>>>>>> >> >>>>>>> Regards
> >>>>>>> >> >>>>>>> Sanjiv Singh
> >>>>>>> >> >>>>>>> Mob :  +091 9990-447-339
> >>>>>>> >> >>>>>>>
> >>>>>>> >> >>>>>>>
> >>>>>>> >> >>>>>>> On Fri, Aug 22, 2014 at 12:54 PM, tobe <
> >>>>>>> tobeg3oogle@gmail.com>
> >>>>>>> >> >>>>>>> wrote:
> >>>>>>> >> >>>>>>>
> >>>>>>> >> >>>>>>>> What're the versions of java and hbase?
> >>>>>>> >> >>>>>>>>
> >>>>>>> >> >>>>>>>>
> >>>>>>> >> >>>>>>>> On Fri, Aug 22, 2014 at 2:41 PM, @Sanjiv Singh <
> >>>>>>> >> >>>>>>>> sanjiv.is.on@gmail.com> wrote:
> >>>>>>> >> >>>>>>>>
> >>>>>>> >> >>>>>>>>> Hi All,
> >>>>>>> >> >>>>>>>>> I just started exploring HBase. I have downloaded the
> >>>>>>> >> >>>>>>>>> HBase master source code. While trying to compile and
> >>>>>>> >> >>>>>>>>> build it locally, I get errors about missing classes and
> >>>>>>> >> >>>>>>>>> packages.
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>> From the source it looks like the missing classes are
> >>>>>>> >> >>>>>>>>> not written in Java but generated from templates such as
> >>>>>>> >> >>>>>>>>> "RSStatusTmpl.jamon". Please help me resolve this issue.
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>> >> mvn clean install -DskipTests
> >>>>>>> >> >>>>>>>>> [INFO]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> ------------------------------------------------------------------------
> >>>>>>> >> >>>>>>>>> [INFO] BUILD FAILURE
> >>>>>>> >> >>>>>>>>> [INFO]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> ------------------------------------------------------------------------
> >>>>>>> >> >>>>>>>>> [INFO] Total time: 31.110 s
> >>>>>>> >> >>>>>>>>> [INFO] Finished at: 2014-08-22T12:00:25+05:30
> >>>>>>> >> >>>>>>>>> [INFO] Final Memory: 63M/408M
> >>>>>>> >> >>>>>>>>> [INFO]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> ------------------------------------------------------------------------
> >>>>>>> >> >>>>>>>>> [ERROR] Failed to execute goal
> >>>>>>> >> >>>>>>>>>
> >>>>>>> org.apache.maven.plugins:maven-compiler-plugin:2.5.1:compile
> >>>>>>> >> >>>>>>>>> (default-compile) on project hbase-server: Compilation
> >>>>>>> failure:
> >>>>>>> >> >>>>>>>>> Compilation
> >>>>>>> >> >>>>>>>>> failure:
> >>>>>>> >> >>>>>>>>> [ERROR]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> /home/impadmin/source-code/hbase/hbase-server/src/main/java/org/apache/hadoop/hbase/master/MasterStatusServlet.java:[36,42]
> >>>>>>> >> >>>>>>>>> error: package org.apache.hadoop.hbase.tmpl.master
> does
> >>>>>>> not
> >>>>>>> >> exist
> >>>>>>> >> >>>>>>>>> [ERROR]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> /home/impadmin/source-code/hbase/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/RSStatusServlet.java:[29,48]
> >>>>>>> >> >>>>>>>>> error: package
> >>>>>>> org.apache.hadoop.hbase.tmpl.regionserver does
> >>>>>>> >> not
> >>>>>>> >> >>>>>>>>> exist
> >>>>>>> >> >>>>>>>>> [ERROR]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> /home/impadmin/source-code/hbase/hbase-server/src/main/java/org/apache/hadoop/hbase/master/MasterStatusServlet.java:[75,4]
> >>>>>>> >> >>>>>>>>> error: cannot find symbol
> >>>>>>> >> >>>>>>>>> [ERROR] symbol:   class MasterStatusTmpl
> >>>>>>> >> >>>>>>>>> [ERROR] location: class MasterStatusServlet
> >>>>>>> >> >>>>>>>>> [ERROR]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> /home/impadmin/source-code/hbase/hbase-server/src/main/java/org/apache/hadoop/hbase/master/MasterStatusServlet.java:[75,32]
> >>>>>>> >> >>>>>>>>> error: cannot find symbol
> >>>>>>> >> >>>>>>>>> [ERROR] symbol:   class MasterStatusTmpl
> >>>>>>> >> >>>>>>>>> [ERROR] location: class MasterStatusServlet
> >>>>>>> >> >>>>>>>>> [ERROR]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> /home/impadmin/source-code/hbase/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/RSStatusServlet.java:[51,4]
> >>>>>>> >> >>>>>>>>> error: cannot find symbol
> >>>>>>> >> >>>>>>>>> [ERROR] symbol:   class RSStatusTmpl
> >>>>>>> >> >>>>>>>>> [ERROR] location: class RSStatusServlet
> >>>>>>> >> >>>>>>>>> [ERROR]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >>
> >>>>>>>
> /home/impadmin/source-code/hbase/hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/RSStatusServlet.java:[51,28]
> >>>>>>> >> >>>>>>>>> error: cannot find symbol
> >>>>>>> >> >>>>>>>>> [ERROR] -> [Help 1]
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>> Regards
> >>>>>>> >> >>>>>>>>> Sanjiv Singh
> >>>>>>> >> >>>>>>>>> Mob :  +091 9990-447-339
> >>>>>>> >> >>>>>>>>>
> >>>>>>> >> >>>>>>>>
> >>>>>>> >> >>>>>>>>
> >>>>>>> >> >>>>>>>
> >>>>>>> >> >>>>>>
> >>>>>>> >> >>>>>
> >>>>>>> >> >>>>
> >>>>>>> >> >>>
> >>>>>>> >> >>
> >>>>>>> >> >
> >>>>>>> >>
> >>>>>>> >
> >>>>>>> >
> >>>>>>>
> >>>>>>
> >>>>>>
> >>>>>
> >>>>
> >>>
> >>
> >
>
