hbase-user mailing list archives

From stack <st...@duboce.net>
Subject Re: HBase Startup Errors
Date Thu, 21 Feb 2008 21:53:33 GMT
Chris Richard wrote:
> API question this time: I'm interested in storing entities with multi-valued
> properties. Is this possible?
>   

Effectively, HBase only speaks bytes.  Your client will need to serialize 
your multi-valued entity into bytes somehow before inserting it into an 
hbase cell.
St.Ack
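
For example, a minimal sketch of one way a client might pack a multi-valued
property into the byte[] that goes into a cell (plain Java serialization
helpers; the class and method names here are illustrative, not an HBase API):

    import java.io.*;
    import java.util.*;

    // Illustrative helper: pack a multi-valued property into bytes and back.
    // Nothing here is HBase-specific; the resulting byte[] is simply what you
    // would hand to the HBase client when writing the cell.
    public class MultiValueCodec {

      public static byte[] encode(List<String> values) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.writeInt(values.size());            // value count
        for (String v : values) {
          out.writeUTF(v);                      // each value, length-prefixed
        }
        out.flush();
        return bytes.toByteArray();
      }

      public static List<String> decode(byte[] data) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
        int count = in.readInt();
        List<String> values = new ArrayList<String>(count);
        for (int i = 0; i < count; i++) {
          values.add(in.readUTF());
        }
        return values;
      }
    }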


> On Thu, Feb 21, 2008 at 2:59 PM, Chris Richard <chris.richard@gmail.com>
> wrote:
>
>   
>> I turned off hdfs permissions completely by making an edit to the hadoop
>> config. Seems to be working now!
>>
>> Thanks,
>>
>> Chris
>>
>>
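
The hadoop config edit described above is presumably the hdfs permission
switch; a sketch of what that looks like in hadoop-site.xml, assuming the
dfs.permissions property that came in with the 0.16 permissions feature:

    <property>
      <name>dfs.permissions</name>
      <value>false</value>
      <description>Turn off hdfs permission checking entirely (fine for a
      single-user dev box, not recommended on a shared cluster).</description>
    </property>
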
>> On Thu, Feb 21, 2008 at 2:34 PM, stack <stack@duboce.net> wrote:
>>
>>     
>>> The exception is from hdfs:
>>>
>>> Caused by: org.apache.hadoop.fs.permission.AccessControlException:
>>> org.apache.hadoop.fs.permission.AccessControlException: Permission denied:
>>> user=dr-dev\sshd_server, access=WRITE,
>>> inode="hbase":dr-dev\administrator:supergroup:rwxr-xr-x
>>>
>>> Seems to be saying that user 'dr-dev\sshd_server' (?) is not allowed to
>>> write to the 'hbase' dir.
>>>
>>> I'd tell you how to fix that if I knew how, but I haven't yet spent time
>>> on the hdfs permission stuff (looks like it will be hard to hold out much
>>> longer.  Smile).
>>>
>>> St.Ack
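
If you want to keep permissions switched on, the usual alternative is to make
the hbase root writable by the user the region server runs as, via the dfs
shell. A sketch, assuming the chown/chmod commands added alongside the 0.16
permissions feature (adjust user and path to your setup):

    bin/hadoop dfs -chown -R 'dr-dev\sshd_server' /hbase

or, more bluntly:

    bin/hadoop dfs -chmod -R 777 /hbase
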
>>>
>>>
>>> Chris Richard wrote:
>>>>
>>>> Sorry, I'm a bit unclear - do I need to configure permissions on dfs
>>>> somehow, use an older version of hadoop, or make a code change to hbase?
>>>>
>>>> On Thu, Feb 21, 2008 at 1:18 PM, stack <stack@duboce.net> wrote:
>>>>
>>>>> You've tripped over the new access control feature of hdfs (you must be
>>>>> using the 0.16.0 release?): hbase is not able to write to the hdfs.  Can you
>>>>> fix that?
>>>>> St.Ack
>>>>>
>>>>> Chris Richard wrote:
>>>>>
>>>>>           
>>>>>> OK, looks like the HMaster starts up OK, but the RegionServer is having
>>>>>> problems related to access control:
>>>>>>
>>>>>> HMaster Logs:
>>>>>>
>>>>>> 2008-02-21 12:55:51,343 INFO org.apache.hadoop.hbase.master.HMaster: Root
>>>>>> region dir: hdfs://localhost:9000/hbase/-ROOT-/70236052
>>>>>> 2008-02-21 12:55:51,500 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
>>>>>> Initializing RPC Metrics with serverName=60000, port=60000
>>>>>> 2008-02-21 12:55:51,578 INFO org.apache.hadoop.hbase.master.HMaster:
>>>>>> HMaster initialized on 127.0.0.1:60000
>>>>>> 2008-02-21 12:55:51,625 INFO org.mortbay.util.Credential: Checking
>>>>>> Resource aliases
>>>>>> 2008-02-21 12:55:51,656 INFO org.mortbay.http.HttpServer: Version
>>>>>> Jetty/5.1.4
>>>>>> 2008-02-21 12:55:51,875 INFO org.mortbay.util.Container: Started
>>>>>> org.mortbay.jetty.servlet.WebApplicationHandler@134ce4a
>>>>>> 2008-02-21 12:55:51,921 INFO org.mortbay.util.Container: Started
>>>>>> WebApplicationContext[/,/]
>>>>>> 2008-02-21 12:55:52,078 INFO org.mortbay.util.Container: Started
>>>>>> org.mortbay.jetty.servlet.WebApplicationHandler@83b1b
>>>>>> 2008-02-21 12:55:52,078 INFO org.mortbay.util.Container: Started
>>>>>> WebApplicationContext[/api,rest]
>>>>>> 2008-02-21 12:55:52,078 INFO org.mortbay.util.Container: Started
>>>>>> HttpContext[/static,/static]
>>>>>> 2008-02-21 12:55:52,093 INFO org.mortbay.http.SocketListener: Started
>>>>>> SocketListener on 0.0.0.0:60010
>>>>>> 2008-02-21 12:55:52,093 INFO org.mortbay.util.Container: Started
>>>>>> org.mortbay.jetty.Server@1aae94f
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> listener on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 1 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 3 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 7 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 5 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 9 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> Responder: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 2 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 0 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 6 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 4 on 60000: starting
>>>>>> 2008-02-21 12:55:52,093 INFO org.apache.hadoop.ipc.Server: IPC Server
>>>>>> handler 8 on 60000: starting
>>>>>> 2008-02-21 12:55:54,406 INFO org.apache.hadoop.hbase.master.HMaster:
>>>>>> received start message from: 192.168.0.102:60020
>>>>>> 2008-02-21 12:56:24,406 INFO org.apache.hadoop.hbase.master.HMaster:
>>>>>> 192.168.0.102:60020 lease expired
>>>>>>
>>>>>> RegionServer Logs:
>>>>>> 2008-02-21 12:55:54,328 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
>>>>>> Initializing RPC Metrics with serverName=60020, port=60020
>>>>>> 2008-02-21 12:55:54,421 INFO org.apache.hadoop.fs.FileSystem: FileSystem
>>>>>> name is: hdfs://localhost:9000/hbase
>>>>>>
>>>>>> 2008-02-21 12:55:54,703 FATAL org.apache.hadoop.hbase.HRegionServer:
>>>>>> Failed init
>>>>>> org.apache.hadoop.fs.permission.AccessControlException:
>>>>>> org.apache.hadoop.fs.permission.AccessControlException: Permission denied:
>>>>>> user=dr-dev\sshd_server, access=WRITE,
>>>>>> inode="hbase":dr-dev\administrator:supergroup:rwxr-xr-x
>>>>>>     at org.apache.hadoop.dfs.PermissionChecker.check(PermissionChecker.java:173)
>>>>>>     at org.apache.hadoop.dfs.PermissionChecker.check(PermissionChecker.java:154)
>>>>>>     at org.apache.hadoop.dfs.PermissionChecker.checkPermission(PermissionChecker.java:102)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.checkPermission(FSNamesystem.java:4035)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4005)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.mkdirsInternal(FSNamesystem.java:1558)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.mkdirs(FSNamesystem.java:1541)
>>>>>>     at org.apache.hadoop.dfs.NameNode.mkdirs(NameNode.java:422)
>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>     at java.lang.reflect.Method.invoke(Method.java:585)
>>>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:409)
>>>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:899)
>>>>>>
>>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
>>>>>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:82)
>>>>>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.checkIOException(RemoteExceptionHandler.java:48)
>>>>>>     at org.apache.hadoop.hbase.HRegionServer.init(HRegionServer.java:883)
>>>>>>     at org.apache.hadoop.hbase.HRegionServer.run(HRegionServer.java:644)
>>>>>>     at java.lang.Thread.run(Thread.java:595)
>>>>>> 2008-02-21 12:55:54,718 FATAL org.apache.hadoop.hbase.HRegionServer:
>>>>>> Unhandled exception. Aborting...
>>>>>> java.io.IOException: region server startup failed
>>>>>>     at org.apache.hadoop.hbase.HRegionServer.init(HRegionServer.java:885)
>>>>>>     at org.apache.hadoop.hbase.HRegionServer.run(HRegionServer.java:644)
>>>>>>     at java.lang.Thread.run(Thread.java:595)
>>>>>> Caused by: org.apache.hadoop.fs.permission.AccessControlException:
>>>>>> org.apache.hadoop.fs.permission.AccessControlException: Permission denied:
>>>>>> user=dr-dev\sshd_server, access=WRITE,
>>>>>> inode="hbase":dr-dev\administrator:supergroup:rwxr-xr-x
>>>>>>     at org.apache.hadoop.dfs.PermissionChecker.check(PermissionChecker.java:173)
>>>>>>     at org.apache.hadoop.dfs.PermissionChecker.check(PermissionChecker.java:154)
>>>>>>     at org.apache.hadoop.dfs.PermissionChecker.checkPermission(PermissionChecker.java:102)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.checkPermission(FSNamesystem.java:4035)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4005)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.mkdirsInternal(FSNamesystem.java:1558)
>>>>>>     at org.apache.hadoop.dfs.FSNamesystem.mkdirs(FSNamesystem.java:1541)
>>>>>>     at org.apache.hadoop.dfs.NameNode.mkdirs(NameNode.java:422)
>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>     at java.lang.reflect.Method.invoke(Method.java:585)
>>>>>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:409)
>>>>>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:899)
>>>>>>
>>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
>>>>>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:82)
>>>>>>     at org.apache.hadoop.hbase.RemoteExceptionHandler.checkIOException(RemoteExceptionHandler.java:48)
>>>>>>     at org.apache.hadoop.hbase.HRegionServer.init(HRegionServer.java:883)
>>>>>>     ... 2 more
>>>>>> 2008-02-21 12:55:54,718 INFO org.apache.hadoop.ipc.Server: Stopping
>>>>>> server on 60020
>>>>>> 2008-02-21 12:55:54,718 INFO org.apache.hadoop.hbase.HRegionServer:
>>>>>> Starting shutdown thread.
>>>>>> 2008-02-21 12:55:54,718 INFO org.apache.hadoop.hbase.HRegionServer:
>>>>>> Shutdown thread complete
>>>>>>
>>>>>>
>>>>>> On Thu, Feb 21, 2008 at 12:07 PM, stack <stack@duboce.net> wrote:
>>>>>>
>>>>>>> Chris Richard wrote:
>>>>>>>
>>>>>>>> Not quite sure how to 'enable DEBUG'.
>>>>>>>>
>>>>>>> Pardon me.  I should have included the following pointer:
>>>>>>> http://wiki.apache.org/hadoop/Hbase/FAQ#4.
>>>>>>> St.Ack
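
The FAQ entry presumably points at the usual log4j knob; as an illustrative
sketch, bumping the hbase loggers to DEBUG in conf/log4j.properties would be
a one-line edit (logger name assumed from the package names in the logs
above):

    log4j.logger.org.apache.hadoop.hbase=DEBUG
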
>>>>>>>
>>>>>>>> On Wed, Feb 20, 2008 at 10:34 PM, stack <stack@duboce.net> wrote:
>>>>>>>>
>>>>>>>>> Enable DEBUG and then check out the master and regionserver logs.  If
>>>>>>>>> you can't figure it, paste what you think relevant sections into mail
>>>>>>>>> and we'll have a look.
>>>>>>>>> Thanks Chris,
>>>>>>>>> St.Ack
>>>>>>>>>
>>>>>>>>> Chris Richard wrote:
>>>>>>>>>
>>>>>>>>>> Follow up to this. I removed the http:// from the hbase.master setting
>>>>>>>>>> and hbase seems to start successfully. Unfortunately I get the following
>>>>>>>>>> error on the web UI and upon running any query in the shell:
>>>>>>>>>>
>>>>>>>>>> org.apache.hadoop.hbase.NoServerForRegionException: Timed out trying to
>>>>>>>>>> locate root region
>>>>>>>>>>
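
To make that fix concrete: with the scheme dropped, hbase.master is just a
bare host:port pair, e.g. (illustrative value, matching the ports used in the
config quoted below):

    <property>
      <name>hbase.master</name>
      <value>localhost:60000</value>
    </property>
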
>>>>>>>>>> On Thu, Feb 21, 2008 at 12:40 AM, Chris Richard <chris.richard@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi All,
>>>>>>>>>>>
>>>>>>>>>>> I've got hdfs (single node) running successfully, as far as I can tell.
>>>>>>>>>>> Unfortunately hbase is exceptioning on startup - probably a config issue
>>>>>>>>>>> but I can't figure out what.
>>>>>>>>>>>
>>>>>>>>>>> I'm using the latest hadoop from the trunk, likewise for hbase, and I'm
>>>>>>>>>>> running on Win2k3 with Cygwin. Following are my hadoop-site.xml and
>>>>>>>>>>> hbase-site.xml:
>>>>>>>>>>>
>>>>>>>>>>> hadoop
>>>>>>>>>>> <configuration>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>hadoop.tmp.dir</name>
>>>>>>>>>>>   <value>/cygdrive/d/datarepublik/tmp/hadoop-${user.name}</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>fs.default.name</name>
>>>>>>>>>>>   <value>localhost:9001</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>mapred.job.tracker</name>
>>>>>>>>>>>   <value>localhost:9002</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>dfs.replication</name>
>>>>>>>>>>>   <value>1</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>mapred.child.java.opts</name>
>>>>>>>>>>>   <value>-Xmx512m</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> </configuration>
>>>>>>>>>>>
>>>>>>>>>>> hbase
>>>>>>>>>>>  <configuration>
>>>>>>>>>>>    <property>
>>>>>>>>>>>      <name>hbase.master</name>
>>>>>>>>>>>      <value>http://localhost:60000</value>
>>>>>>>>>>>      <description>The host and port that the HBase master runs
>>>>>>>>>>> at</description>
>>>>>>>>>>>    </property>
>>>>>>>>>>>    <property>
>>>>>>>>>>>      <name>hbase.rootdir</name>
>>>>>>>>>>>      <value>hdfs://localhost:9001/hbase</value>
>>>>>>>>>>>      <description>The directory shared by region servers</description>
>>>>>>>>>>>    </property>
>>>>>>>>>>>  </configuration>
>>>>>>>>>>>
>>>>>>>>>>> The exception I get is:
>>>>>>>>>>>
>>>>>>>>>>> ...
>>>>>>>>>>> 2008-02-21 00:33:09,906 WARN org.apache.hadoop.util.NativeCodeLoader:
>>>>>>>>>>> Unable to load native-hadoop library for your platform... using
>>>>>>>>>>> builtin-java classes where applicable
>>>>>>>>>>> 2008-02-21 00:33:10,000 INFO org.apache.hadoop.hbase.HRegion: closed
>>>>>>>>>>> -ROOT-,,0
>>>>>>>>>>> 2008-02-21 00:33:10,453 INFO org.apache.hadoop.hbase.HRegion: closed
>>>>>>>>>>> .META.,,1
>>>>>>>>>>> 2008-02-21 00:33:10,500 ERROR org.apache.hadoop.hbase.master.HMaster:
>>>>>>>>>>> Can not start master
>>>>>>>>>>> java.lang.reflect.InvocationTargetException
>>>>>>>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>>>>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>>>>>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>>>>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
>>>>>>>>>>>     at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1540)
>>>>>>>>>>>     at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1574)
>>>>>>>>>>> Caused by: java.lang.NullPointerException
>>>>>>>>>>>     at org.apache.hadoop.hbase.HServerAddress.getBindAddress(HServerAddress.java:94)
>>>>>>>>>>>     at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:319)
>>>>>>>>>>>     at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:242)
>>>>>>>>>>>     ... 6 more
>>>>>>>>>>>
>>>>>>>>>>> I'm a bit of a Java n00b so I couldn't figure out how to attach to the
>>>>>>>>>>> process in time to debug the error. Any help would be greatly appreciated.
>>>>>>>>>>>
>>>>>>>>>>> Thanks.
>>>>>>>>>>>
>>>>>>>>>>> Chris Richard

