hive-user mailing list archives

From Vikas Parashar <para.vi...@gmail.com>
Subject Re: Repeated Hive start-up issues
Date Fri, 15 May 2015 14:43:30 GMT
Hi Anand,

That depends on the issue. You will have to read the namenode logs to see
what is happening.
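
For the safe-mode problem discussed later in this thread, a minimal sketch of how to check the namenode's safe-mode status from the command line (assuming a running Hadoop 2.x cluster with `hdfs` on the PATH):

```shell
# Assumes a running Hadoop 2.x cluster with the hdfs command on PATH.
# Check whether the namenode is currently in safe mode:
hdfs dfsadmin -safemode get

# Block until the namenode leaves safe mode on its own (the log below
# says "Safe mode will be turned off automatically in 13 seconds"):
hdfs dfsadmin -safemode wait

# Or force it out manually, e.g. on a single-node test setup:
hdfs dfsadmin -safemode leave
```

This is only a sketch: forcing safe mode off on a real cluster can hide genuine block-replication problems, so reading the namenode log first, as suggested above, is the safer route.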


Sent from really tiny device :)




On Friday, May 15, 2015, Anand Murali <anand_vihar@yahoo.com> wrote:

> Hi:
>
> Many thanks for replying. Can you please tell me how to fix namenode safe
> mode issue. I am new to Hadoop.
>
> Thanks
>
> Regards
>
> Anand
>
> Sent from my iPhone
>
> On 15-May-2015, at 7:14 pm, Xuefu Zhang <xzhang@cloudera.com> wrote:
>
> Your namenode is in safe mode, as the exception shows. You need to
> verify/fix that before trying Hive.
>
> Secondly, "!=" may not work as expected in the Hive CLI. Try "<>" or a
> simpler query first.
>
> --Xuefu
>
> On Fri, May 15, 2015 at 6:17 AM, Anand Murali <anand_vihar@yahoo.com> wrote:
>
>> Hi All:
>>
>> I have installed Hadoop 2.6 and Hive 1.1. The first time I start Hive
>> after bringing up the cluster, I get the following:
>>
>> $hive
>>
>> Logging initialized using configuration in
>> jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-common-1.1.0.jar!/hive-log4j.properties
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/home/anand_vihar/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> Exception in thread "main" java.lang.RuntimeException:
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException):
>> Cannot create directory
>> /tmp/hive/anand_vihar/a9d68b70-01b4-4d4d-9d06-1f86efc3b2bc. Name node is in
>> safe mode.
>> The reported blocks 2 has reached the threshold 0.9990 of total blocks 2.
>> The number of live datanodes 1 has reached the minimum number 0. In safe
>> mode extension. Safe mode will be turned off automatically in 13 seconds.
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4216)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4191)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:813)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:600)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>
>>     at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>> Caused by:
>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException):
>> Cannot create directory
>> /tmp/hive/anand_vihar/a9d68b70-01b4-4d4d-9d06-1f86efc3b2bc. Name node is in
>> safe mode.
>> The reported blocks 2 has reached the threshold 0.9990 of total blocks 2.
>> The number of live datanodes 1 has reached the minimum number 0. In safe
>> mode extension. Safe mode will be turned off automatically in 13 seconds.
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4216)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4191)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:813)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:600)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1468)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>     at com.sun.proxy.$Proxy13.mkdirs(Unknown Source)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>>     at
>> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source)
>>     at
>> org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753)
>>     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724)
>>     at
>> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870)
>>     at
>> org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866)
>>     at
>> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>     at
>> org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866)
>>     at
>> org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859)
>>     at
>> org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:584)
>>     at
>> org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:526)
>>     at
>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:458)
>>     ... 8 more
>>
>> *Now, if I start Hive again, it works:*
>>
>> anand_vihar@Latitude-E5540:~$ hive
>> Logging initialized using configuration in
>> jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-common-1.1.0.jar!/hive-log4j.properties
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/home/anand_vihar/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>
>> However, when I run SQL commands, jline fails as below.
>>
>> select year, MAX(temperature) from records where temperature != 9999
>> group by year;
>> [ERROR] Could not expand event
>> java.lang.IllegalArgumentException: != 9999 group by year;: event not
>> found
>>     at jline.console.ConsoleReader.expandEvents(ConsoleReader.java:779)
>>     at jline.console.ConsoleReader.finishBuffer(ConsoleReader.java:631)
>>     at jline.console.ConsoleReader.accept(ConsoleReader.java:2019)
>>     at jline.console.ConsoleReader.readLine(ConsoleReader.java:2666)
>>     at jline.console.ConsoleReader.readLine(ConsoleReader.java:2269)
>>     at
>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:748)
>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>
>>     >
>>
>> Can somebody advise?
>>
>> Thanks
>>
>> Regards,
>>
>>
>>
>>
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>
>
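Following up on the "!=" suggestion above: jline's history expansion treats "!" as a special event character at the interactive prompt, which is what produces the "event not found" error in the thread. A sketch of the rewritten query using the standard SQL "<>" operator instead (hypothetical invocation; the `records` table and a working Hive/Hadoop setup are taken from the thread, not verified here):

```shell
# Hypothetical session; assumes a running Hive/Hadoop cluster with the
# "records" table from the thread. Running the query non-interactively
# via "hive -e" also sidesteps jline's interactive "!" expansion:
hive -e "SELECT year, MAX(temperature) FROM records WHERE temperature <> 9999 GROUP BY year;"
```

Using "<>" avoids the jline issue entirely, since the query no longer contains a "!" for the console reader to expand.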
