From: sagar nikam <sagarnikam123@gmail.com>
To: user@hive.apache.org, dev@hive.apache.org
Date: Sat, 27 Oct 2012 15:14:12 +0530
Subject: Re: FAILED: Hive Internal Error

Respected Sir/Madam,

> I have installed Hadoop on my Ubuntu 12.04 system, and I installed Hive as
> well. It worked fine for some days, but one day I shut down my machine
> directly (without cleanly stopping Hive and Hadoop). Now some queries throw
> an error. Queries like "show databases" and "use database_name" still work
> fine, but the query below fails:
>
> hive> select count(*) from cidade;
>
> Error thrown:
>
> FAILED: Hive Internal Error:
> java.lang.RuntimeException(java.net.ConnectException: Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused)
> java.lang.RuntimeException: java.net.ConnectException: Call to
> localhost/127.0.0.1:54310 failed on connection exception:
> java.net.ConnectException: Connection refused
>     at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:151)
>     at org.apache.hadoop.hive.ql.Context.getMRScratchDir(Context.java:190)
>     at org.apache.hadoop.hive.ql.Context.getMRTmpFileURI(Context.java:247)
>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:900)
>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6594)
>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238)
>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736)
>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
>     at org.apache.hadoop.ipc.Client.call(Client.java:743)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>     at $Proxy4.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
>     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
>     at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:145)
>     ... 15 more
> Caused by: java.net.ConnectException: Connection refused
>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
>     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
>     at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
>     at org.apache.hadoop.ipc.Client.call(Client.java:720)
>     ... 28 more
>
> =========================================================================
>
> Is it possible that some files were damaged during the shutdown?
> What could be causing this error?
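
For what it is worth, as I understand it the failing call in the trace is
ultimately just a TCP connection to localhost:54310, the NameNode RPC address
shown in the error. The standalone check below is only an illustrative sketch
of that connection attempt, not Hadoop's actual client code; the host and port
are copied from the error message above.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    // Illustrative sketch only: try to open a plain TCP connection to the
    // NameNode RPC address from the stack trace (localhost:54310).
    // "Connection refused" here would mean nothing is listening on that port,
    // i.e. the HDFS NameNode daemon is most likely not running.
    public class NameNodePortCheck {
        public static void main(String[] args) {
            Socket socket = new Socket();
            try {
                socket.connect(new InetSocketAddress("localhost", 54310), 5000);
                System.out.println("Connected: something is listening on port 54310.");
            } catch (IOException e) {
                System.out.println("Could not connect to port 54310: " + e.getMessage());
            } finally {
                try { socket.close(); } catch (IOException ignored) { }
            }
        }
    }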