Subject: Re: Repeated Hive start-up issues
From: Vikas Parashar
To: user@hive.apache.org
Date: Fri, 15 May 2015 20:13:30 +0530

Hi Anand,

That depends on the issue. You have to understand the namenode logs.
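
For example, assuming the standard Hadoop binaries are on your PATH, you can
check the namenode's safe-mode state from the shell and, if it does not leave
safe mode on its own, force it out:

    $ hdfs dfsadmin -safemode get    # reports whether safe mode is ON or OFF
    $ hdfs dfsadmin -safemode wait   # blocks until the namenode leaves safe mode
    $ hdfs dfsadmin -safemode leave  # forces the namenode out of safe mode

Note that the trace below already says "Safe mode will be turned off
automatically in 13 seconds", which would explain why your second start-up
attempt succeeded.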

Sent from really tiny device :)

On Friday, May 15, 2015, Anand Murali <anand_vihar@yahoo.com> wrote:

> Hi:
>
> Many thanks for replying. Can you please tell me how to fix the namenode
> safe-mode issue? I am new to Hadoop.
>
> Thanks
>
> Regards
>
> Anand
>
> Sent from my iPhone
>
> On 15-May-2015, at 7:14 pm, Xuefu Zhang <xzhang@cloudera.com> wrote:
>
> Your namenode is in safe mode, as the exception shows. You need to
> verify/fix that before trying Hive.
>
> Secondly, "!=" may not work as expected. Try "<>" or another, simpler
> query first.
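>
> For example, the same query with "<>" (keeping your table and column
> names) drops the "!" character that the Hive CLI's jline reader tries to
> expand as a history event, which is what the "event not found" error
> below complains about:
>
>   select year, MAX(temperature) from records where temperature <> 9999 group by year;
>
> Running the statement non-interactively, e.g. hive -e 'select ... ;',
> should also sidestep the CLI's history expansion.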
>
> --Xuefu
>
> On Fri, May 15, 2015 at 6:17 AM, Anand Murali <anand_vihar@yahoo.com> wrote:
>
>> Hi All:
>>
>> I have installed Hadoop 2.6 and Hive 1.1, and when I try to start Hive
>> the first time after starting the cluster, I get the following:
>>
>> $ hive
>>
>> Logging initialized using configuration in jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-common-1.1.0.jar!/hive-log4j.properties
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/anand_vihar/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /tmp/hive/anand_vihar/a9d68b70-01b4-4d4d-9d06-1f86efc3b2bc. Name node is in safe mode.
>> The reported blocks 2 has reached the threshold 0.9990 of total blocks 2. The number of live datanodes 1 has reached the minimum number 0. In safe mode extension. Safe mode will be turned off automatically in 13 seconds.
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4216)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4191)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:813)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:600)
>>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>
>>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /tmp/hive/anand_vihar/a9d68b70-01b4-4d4d-9d06-1f86efc3b2bc. Name node is in safe mode.
>> The reported blocks 2 has reached the threshold 0.9990 of total blocks 2. The number of live datanodes 1 has reached the minimum number 0. In safe mode extension. Safe mode will be turned off automatically in 13 seconds.
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1364)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4216)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4191)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:813)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:600)
>>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033)
>>
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1468)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>     at com.sun.proxy.$Proxy13.mkdirs(Unknown Source)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source)
>>     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753)
>>     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866)
>>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859)
>>     at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:584)
>>     at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:526)
>>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:458)
>>     ... 8 more
>>
>> Now, if I try to start Hive again, it works:
>>
>> anand_vihar@Latitude-E5540:~$ hive
>> Logging initialized using configuration in jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-common-1.1.0.jar!/hive-log4j.properties
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/anand_vihar/hive-1.1.0/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/anand_vihar/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>
>> However, when I start to run SQL commands, jline fails as below:
>>
>> select year, MAX(temperature) from records where temperature != 9999 group by year;
>> [ERROR] Could not expand event
>> java.lang.IllegalArgumentException: != 9999 group by year;: event not found
>>     at jline.console.ConsoleReader.expandEvents(ConsoleReader.java:779)
>>     at jline.console.ConsoleReader.finishBuffer(ConsoleReader.java:631)
>>     at jline.console.ConsoleReader.accept(ConsoleReader.java:2019)
>>     at jline.console.ConsoleReader.readLine(ConsoleReader.java:2666)
>>     at jline.console.ConsoleReader.readLine(ConsoleReader.java:2269)
>>     at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:748)
>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>>
>> Can somebody advise?
>>
>> Thanks
>>
>> Regards,
>>
>> Anand Murali
>> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
>> Chennai - 600 004, India
>> Ph: (044)-28474593 / 43526162 (voicemail)