hive-user mailing list archives

From Ajit Kumar Shreevastava <Ajit.Shreevast...@hcl.com>
Subject RE: FAILED: Hive Internal Error: java.lang.RuntimeException(java.net.ConnectException
Date Mon, 29 Oct 2012 06:10:39 GMT
Hi Sagar,

MapReduce is not working.
Please check the corresponding XML configuration files, and run the jps command to see whether the
daemons have started or not. You can also check the JobTracker and TaskTracker logs.
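
For example, a quick check could look like this (only a rough sketch; the log file names assume the default Hadoop 0.20 layout under the hadoop-0.20.2 install directory shown below, so adjust the paths to your installation):

shell> jps                                                            # NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker should all be listed
shell> tail -n 50 ~/Hadoop/hadoop-0.20.2/logs/hadoop-*-namenode-*.log    # look for bind or "not formatted" errors
shell> tail -n 50 ~/Hadoop/hadoop-0.20.2/logs/hadoop-*-jobtracker-*.log
shell> tail -n 50 ~/Hadoop/hadoop-0.20.2/logs/hadoop-*-tasktracker-*.log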

Regards
Ajit

From: sagar nikam [mailto:sagarnikam123@gmail.com]
Sent: Monday, October 29, 2012 11:02 AM
To: user@hive.apache.org
Subject: FAILED: Hive Internal Error: java.lang.RuntimeException(java.net.ConnectException

hive> show databases;
OK
default
mm
mm2
xyz
Time taken: 6.058 seconds
hive> use mm2;
OK
Time taken: 0.039 seconds
hive> show tables;
OK
cidade
concessionaria
familia
modelo
venda
Time taken: 0.354 seconds
hive> select count(*) from familia;

FAILED: Hive Internal Error: java.lang.RuntimeException(java.net.ConnectException: Call to
localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException:
Connection refused)
java.lang.RuntimeException: java.net.ConnectException: Call to localhost/127.0.0.1:54310
failed on connection exception: java.net.ConnectException: Connection refused
            at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:151)
            at org.apache.hadoop.hive.ql.Context.getMRScratchDir(Context.java:190)
            at org.apache.hadoop.hive.ql.Context.getMRTmpFileURI(Context.java:247)
            at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:900)
            at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6594)
            at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238)
            at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340)
            at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736)
            at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
            at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
            at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:597)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310
failed on connection exception: java.net.ConnectException: Connection refused
            at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
            at org.apache.hadoop.ipc.Client.call(Client.java:743)
            at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
            at $Proxy4.getProtocolVersion(Unknown Source)
            at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
            at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
            at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
            at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
            at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
            at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
            at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
            at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
            at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
            at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
            at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:145)
            ... 15 more
Caused by: java.net.ConnectException: Connection refused
            at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
            at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
            at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
            at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
            at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
            at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
            at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
            at org.apache.hadoop.ipc.Client.call(Client.java:720)
            ... 28 more
=========================================================================================================================
After that, I did this as well:

shell $> jps
3630 TaskTracker
3403 JobTracker
3086 DataNode
3678 Jps
3329 SecondaryNameNode
==========================================================================================================================
The JobTracker web interface is running fine at http://localhost:50030/jobtracker.jsp in the
browser and shows:

localhost Hadoop Map/Reduce Administration

State: INITIALIZING
Started: Sat Oct 27 17:41:34 IST 2012
Version: 0.20.2, r911707
Compiled: Fri Feb 19 08:07:34 UTC 2010 by chrisdo
Identifier: 201210271741
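
A related quick check (just a sketch; in Hadoop 0.20 the NameNode web UI is normally on port 50070) would be:

shell> curl -s http://localhost:50070/dfshealth.jsp | head    # no response here suggests the NameNode process is not running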




I tried to format the NameNode with the command below, but it shows an error:

shell>:~/Hadoop/hadoop-0.20.2/bin$ ./hadoop dfs namenode -format
12/10/27 17:45:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
12/10/27 17:45:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
12/10/27 17:45:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).
12/10/27 17:45:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).
12/10/27 17:45:10 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).
12/10/27 17:45:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s).
12/10/27 17:45:12 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 6 time(s).
12/10/27 17:45:13 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 7 time(s).
12/10/27 17:45:14 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 8 time(s).
12/10/27 17:45:15 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).
Bad connection to FS. command aborted.
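
As a side note (only a sketch): "hadoop dfs ..." invokes the HDFS shell, which tries to contact the NameNode, hence the retries above. On 0.20 the NameNode is normally formatted directly, with the cluster stopped first, e.g.:

shell> ~/Hadoop/hadoop-0.20.2/bin/stop-all.sh
shell> ~/Hadoop/hadoop-0.20.2/bin/hadoop namenode -format    # caution: formatting wipes any existing HDFS data
shell> ~/Hadoop/hadoop-0.20.2/bin/start-all.sh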
========================================================================================================================

Port info:

shell>:~/Hadoop/hadoop-0.20.2/conf$ netstat -tulpn
(Not all processes could be identified, non-owned process info
 will not be shown, you would have to be root to see it all.)
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
tcp        0      0 127.0.0.1:3306          0.0.0.0:*               LISTEN      -
tcp        0      0 0.0.0.0:50060           0.0.0.0:*               LISTEN      4328/java
tcp        0      0 0.0.0.0:50030           0.0.0.0:*               LISTEN      4101/java
tcp        0      0 127.0.0.1:45298         0.0.0.0:*               LISTEN      4328/java
tcp        0      0 0.0.0.0:48946           0.0.0.0:*               LISTEN      3784/java
tcp        0      0 0.0.0.0:54771           0.0.0.0:*               LISTEN      4027/java
tcp        0      0 127.0.0.1:53            0.0.0.0:*               LISTEN      -
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      -
tcp        0      0 127.0.0.1:631           0.0.0.0:*               LISTEN      -
tcp        0      0 0.0.0.0:51194           0.0.0.0:*               LISTEN      4101/java
tcp        0      0 0.0.0.0:8006            0.0.0.0:*               LISTEN      -
tcp        0      0 127.0.0.1:54311         0.0.0.0:*               LISTEN      4101/java
tcp        0      0 0.0.0.0:8007            0.0.0.0:*               LISTEN      -
tcp6       0      0 :::22                   :::*                    LISTEN      -
tcp6       0      0 ::1:631                 :::*                    LISTEN      -
udp        0      0 0.0.0.0:52059           0.0.0.0:*                           -
udp        0      0 127.0.0.1:53            0.0.0.0:*                           -
udp        0      0 0.0.0.0:68              0.0.0.0:*                           -
udp        0      0 0.0.0.0:5353            0.0.0.0:*                           -
udp6       0      0 :::50206                :::*                                -
udp6       0      0 :::5353                 :::*                                -
=========================================================================================

If I run either of these:

shell> lsof -i tcp:54310
shell> netstat | grep 54310

nothing is shown, which means nothing is listening on port 54310.
====================================================================================================
I have also attached the core-site.xml file.
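
For comparison only (illustrative values, not the attached file): in a pseudo-distributed 0.20 setup, core-site.xml usually points fs.default.name at that same host and port, roughly like this:

shell> cat ~/Hadoop/hadoop-0.20.2/conf/core-site.xml
<?xml version="1.0"?>
<configuration>
  <!-- illustrative values only; the actual attached file may differ -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>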


