Subject: Re: Unable to start Hive
From: Vikas Parashar
To: user@hadoop.apache.org
Date: Fri, 15 May 2015 14:52:38 +0530

Please send me the output of the command below:

# hadoop dfsadmin -report

On Fri, May 15, 2015 at 2:43 PM, Anand Murali wrote:

> Vikas,
>
> Can you be more specific? What should I check for in the Hive logs?
>
> Thanks
>
> Regards
>
> Anand
>
> Sent from my iPhone
>
> On 15-May-2015, at 2:41 pm, Vikas Parashar wrote:
>
> Hi Anand,
>
> It seems your namenode is working fine. I can't see any "safemode"-related
> logs in your namenode file. Kindly check the Hive logs as well.
>
> On Fri, May 15, 2015 at 12:40 PM, Anand Murali wrote:
>
>> Vikas:
>>
>> Please find attached. At this time I would like to tell you that with the
>> current installation, I am able to run MapReduce jobs and Pig scripts
>> without any installation errors.
>> So please, any suggestions made should not
>> break or cascade into the other installations.
>>
>> Thanks
>>
>> Regards,
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
>>
>> On Friday, May 15, 2015 12:31 PM, Kiran Dangeti <kirandkumar2013@gmail.com> wrote:
>>
>> Anand,
>> Sometimes it errors out because some resources are not available, so stop
>> and restart the Hadoop cluster and see.
>>
>> On May 15, 2015 12:24 PM, "Anand Murali" wrote:
>>
>> Dear All:
>>
>> I am running Hadoop 2.6 (pseudo-distributed mode) on Ubuntu 15.04 and am
>> trying to connect Hive to it after installation. At start-up I source a
>> script, ". .hadoop", which sets the environment variables below.
>>
>> *.hadoop*
>> export HADOOP_HOME=/home/anand_vihar/hadoop-2.6.0
>> export JAVA_HOME=/home/anand_vihar/jdk1.7.0_75/
>> export HADOOP_PREFIX=/home/anand_vihar/hadoop-2.6.0
>> export HADOOP_INSTALL=/home/anand_vihar/hadoop-2.6.0
>> export PIG_HOME=/home/anand_vihar/pig-0.14.0
>> export PIG_INSTALL=/home/anand_vihar/pig-0.14.0
>> export PIG_CLASSPATH=/home/anand_vihar/hadoop-2.6.0/etc/hadoop/
>> export HIVE_HOME=/home/anand_vihar/hive-1.1.0
>> export HIVE_INSTALL=/home/anand_vihar/hive-1.1.0
>> export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin:$HADOOP_HOME:$JAVA_HOME:$PIG_INSTALL/bin:$PIG_CLASSPATH:$HIVE_HOME:$HIVE_INSTALL/bin
>> echo $HADOOP_HOME
>> echo $JAVA_HOME
>> echo $HADOOP_INSTALL
>> echo $PIG_HOME
>> echo $PIG_INSTALL
>> echo $PIG_CLASSPATH
>> echo $HIVE_HOME
>> echo $PATH
>>
>> *Error*
>>
>> anand_vihar@Latitude-E5540:~$ hive
>>
>> Logging initialized using configuration in
>> jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-common-1.1.0.jar!/hive-log4j.properties
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/home/anand_vihar/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException):
>> Cannot create directory /tmp/hive/anand_vihar/a9eb2cf7-9890-4ec3-af6c-ae0c40d9e9d7. Name node is in safe mode.
>> The reported blocks 2 has reached the threshold 0.9990 of total blocks 2.
>> The number of live datanodes 1 has reached the minimum number 0. In safe
>> mode extension. Safe mode will be turned off automatically in 6 seconds.
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4216)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4191)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:813)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:600)
>>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>
>>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException):
>> Cannot create directory /tmp/hive/anand_vihar/a9eb2cf7-9890-4ec3-af6c-ae0c40d9e9d7. Name node is in safe mode.
>> The reported blocks 2 has reached the threshold 0.9990 of total blocks 2.
>> The number of live datanodes 1 has reached the minimum number 0. In safe
>> mode extension. Safe mode will be turned off automatically in 6 seconds.
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4216)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4191)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:813)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:600)
>>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1468)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>     at com.sun.proxy.$Proxy13.mkdirs(Unknown Source)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source)
>>     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753)
>>     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866)
>>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859)
>>     at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:584)
>>     at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:526)
>>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:458)
>>     ... 8 more
>>
>> Can somebody advise.
>>
>> Thanks
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)- 28474593/ 43526162 (voicemail)
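[Editor's note: the root cause in the trace above is an HDFS SafeModeException, and the message itself says safe mode will clear automatically within seconds. Besides the `hadoop dfsadmin -report` Vikas asked for, the standard Hadoop 2.x `dfsadmin -safemode` subcommands can confirm this directly. This is a hedged sketch, assuming a running pseudo-distributed cluster on the same node; it is not part of the original thread.]

```shell
# Check whether the NameNode is currently in safe mode
hdfs dfsadmin -safemode get     # prints "Safe mode is ON" or "Safe mode is OFF"

# Block until the NameNode leaves safe mode on its own, then start Hive
hdfs dfsadmin -safemode wait && hive

# Force the NameNode out of safe mode (last resort; low-risk on a
# single-node pseudo cluster, riskier on a real one)
# hdfs dfsadmin -safemode leave
```

On Hadoop 2.6 the older `hadoop dfsadmin ...` spelling still works but is deprecated in favour of `hdfs dfsadmin ...`.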