Subject: Pegasus
From: Deep Pradhan <deeppradhan@gmail.com>
To: user@hadoop.apache.org
Date: Wed, 9 Jul 2014 15:06:45 +0530
Hi,

I am using Pegasus. Can someone help me with this error?

When I run the "list" command in the UI after giving a "demo" command (demo adds the graph catstar, but I get an error afterwards), I get the following:

PEGASUS> list
=== GRAPH LIST ===
14/07/09 14:45:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
catstar

But when the "demo" command is given, I get the following error (I had run "demo" even before giving the "list" command):

PEGASUS> demo
14/07/09 14:45:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/07/09 14:46:35 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/192.168.1.105:50010]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:532)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1341)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1167)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1122)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
14/07/09 14:46:35 INFO hdfs.DFSClient: Abandoning BP-305979739-192.168.1.10-1403675139790:blk_1073741835_1011
14/07/09 14:46:35 INFO hdfs.DFSClient: Excluding datanode 192.168.1.105:50010
14/07/09 14:46:35 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/deep/pegasus/graphs/catstar/edge/catepillar_star.edge._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1).
There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)
    at org.apache.hadoop.ipc.Client.call(Client.java:1406)
    at org.apache.hadoop.ipc.Client.call(Client.java:1359)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:348)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1264)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1112)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
put: File /user/deep/pegasus/graphs/catstar/edge/catepillar_star.edge._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
Graph catstar added.
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
rmr: DEPRECATED: Please use 'rm -r' instead.
14/07/09 14:46:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
rmr: `dd_node_deg': No such file or directory
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
rmr: DEPRECATED: Please use 'rm -r' instead.
14/07/09 14:46:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
rmr: `dd_deg_count': No such file or directory

-----===[PEGASUS: A Peta-Scale Graph Mining System]===-----

[PEGASUS] Computing degree distribution. Degree type = InOut

14/07/09 14:46:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/07/09 14:46:38 INFO client.RMProxy: Connecting to ResourceManager at deep-Lenovo-IdeaPad-Y510P/10.0.1.89:8032
14/07/09 14:46:38 INFO client.RMProxy: Connecting to ResourceManager at deep-Lenovo-IdeaPad-Y510P/10.0.1.89:8032
14/07/09 14:47:39 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/192.168.1.105:50010]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:532)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1341)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1167)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1122)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
14/07/09 14:47:39 INFO hdfs.DFSClient: Abandoning BP-305979739-192.168.1.10-1403675139790:blk_1073741836_1012
14/07/09 14:47:39 INFO hdfs.DFSClient: Excluding datanode 192.168.1.105:50010
14/07/09 14:47:39 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)
    at org.apache.hadoop.ipc.Client.call(Client.java:1406)
    at org.apache.hadoop.ipc.Client.call(Client.java:1359)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:348)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1264)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1112)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
14/07/09 14:47:39 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004
14/07/09 14:47:39 WARN security.UserGroupInformation: PriviledgedActionException as:deep (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)
14/07/09 14:47:39 WARN security.UserGroupInformation: PriviledgedActionException as:deep (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)
Exception in thread "main" org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)
    at org.apache.hadoop.ipc.Client.call(Client.java:1406)
    at org.apache.hadoop.ipc.Client.call(Client.java:1359)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:348)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1264)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1112)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
rmr: DEPRECATED: Please use 'rm -r' instead.
14/07/09 14:47:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
rmr: `pegasus/graphs/catstar/results/deg/inout/*': No such file or directory
14/07/09 14:47:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mv: `dd_node_deg': No such file or directory
14/07/09 14:47:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mv: `dd_deg_count': No such file or directory
14/07/09 14:47:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
cat: `pegasus/graphs/catstar/results/deg/inout/dd_deg_count/*': No such file or directory
Error: can't mine inout degree of the graph catstar. Check whether the inout degree is computed, or gnuplot is installed correctly.

GNUplot, ant, python, and java have all been installed correctly.

*Whether you think you can or you cannot.....either way you are right....*

With Regards...
Deep
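The recurring failure in the log above is the HDFS client timing out (60000 ms) while opening the data-transfer connection to the DataNode at 192.168.1.105:50010; the client then excludes that DataNode, and with only one DataNode in the cluster every write fails with "could only be replicated to 0 nodes". A quick way to separate a network problem from a Pegasus problem is a raw TCP probe of that port from the machine running Pegasus. This is only a sketch: the host and port are taken from the log and are specific to this cluster, and it assumes bash and the coreutils `timeout` utility.

```shell
# Sketch: raw TCP reachability check for the DataNode data-transfer port
# that the HDFS client is timing out on. Host/port come from the log above.
check_port() {
  local host=$1 port=$2
  # bash's /dev/tcp pseudo-device attempts a plain TCP connect;
  # `timeout 5` bounds the wait instead of HDFS's 60000 ms default.
  if timeout 5 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "reachable: ${host}:${port}"
  else
    echo "unreachable: ${host}:${port}"
  fi
}

check_port 192.168.1.105 50010   # the DataNode excluded in the log
```

If this reports "unreachable" while `hdfs dfsadmin -report` still lists the DataNode as live, the issue is addressing or firewalling between client and DataNode rather than Pegasus or the demo script itself.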

Hi, I am = using Pegasus. Can someone help me with this error?
When I run the command "list" in the UI, after giving a "demo" command (demo adds the graph catstar, but I get error afterwards), I get the following
PEGASUS> list
=3D=3D=3D GRAPH LIST =3D=3D=3D

14/07/09 14:45:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable catstar

But, when the "demo" command is given I get the following error (= I have done "demo" even before giving "list" command): PEGASUS> demo
14/07/09 14:45:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 14/07/09 14:46:35 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=3D/192.168.1.105:50010] =C2=A0=C2=A0=C2=A0 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:532)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStr= eam.java:1341)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream= (DFSOutputStream.java:1167)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(D= FSOutputStream.java:1122)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.jav= a:522)
14/07/09 14:46:35 INFO hdfs.DFSClient: Abandoning BP-305979739-192.168.1.10-1403675139790:blk_1073741835_1011
14/07/09 14:46:35 INFO hdfs.DFSClient: Excluding datanode 192.168.1.105:50010
14/07/09 14:46:35 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/deep/pegasus/graphs/catstar/edge/catepillar_star.edge._COPYING_ could only be replicated to 0 nodes instead of minReplication (=3D1).= =C2=A0 There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(Blo= ckManager.java:1406)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNa= mesystem.java:2596)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeR= pcServer.java:563)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocolPB.= ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtoco= lServerSideTranslatorPB.java:407)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNa= menodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(Prot= obufRpcEngine.java:585)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handl= er$1.run(Server.java:1958)
=C2=A0=C2=A0=C2=A0 at java.security.AccessController.doPrivileged(Native Method)
=C2=A0=C2=A0=C2=A0 at javax.security.auth.Subject.doAs(Subject.java:415)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.j= ava:1548)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)

=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Client.call(Client.java:1406)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Client.call(Client.java:1359)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.ja= va:206)
=C2=A0=C2=A0=C2=A0 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:5= 7)
=C2=A0=C2=A0=C2=A0 at sun.reflect.DelegatingMethodAccess= orImpl.invoke(DelegatingMethodAccessorImpl.java:43)
=C2=A0=C2=A0=C2=A0 at java.lang.reflect.Method.invoke(Method.java:606)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocat= ionHandler.java:186)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.io.retry.RetryIn= vocationHandler.invoke(RetryInvocationHandler.java:102)
=C2=A0=C2=A0=C2=A0 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlo= ck(ClientNamenodeProtocolTranslatorPB.java:348)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DF= SOutputStream.java:1264)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(D= FSOutputStream.java:1112)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputSt= ream$DataStreamer.run(DFSOutputStream.java:522)
put: File /user/deep/pegasus/graphs/catstar/edge/catepillar_star.edge._COPY= ING_ could only be replicated to 0 nodes instead of minReplication (=3D1).=C2=A0 There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
Graph catstar added.
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

rmr: DEPRECATED: Please use 'rm -r' instead.
14/07/09 14:46:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable rmr: `dd_node_deg': No such file or directory
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

rmr: DEPRECATED: Please use 'rm -r' instead.
14/07/09 14:46:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable rmr: `dd_deg_count': No such file or directory

-----=3D=3D=3D[PEGASUS: A Peta-Scale Graph Mining System]=3D=3D=3D-----

[PEGASUS] Computing degree distribution. Degree type =3D InOut

14/07/09 14:46:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 14/07/09 14:46:38 INFO client.RMProxy: Connecting to ResourceManager at deep-Lenovo-IdeaPad-Y510P/10.0.1.89:8032<= /a>
14/07/09 14:46:38 INFO client.RMProxy: Connecting to ResourceManager at deep-Lenovo-IdeaPad-Y510P/
10.0.1.89:8032<= /a>
14/07/09 14:47:39 INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=3D/
192.168.1.105:50010] =C2=A0=C2=A0=C2=A0 at org.apache.hadoop.net.NetUtils.con= nect(NetUtils.java:532)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStr= eam.java:1341)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream= (DFSOutputStream.java:1167)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(D= FSOutputStream.java:1122)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.jav= a:522)
14/07/09 14:47:39 INFO hdfs.DFSClient: Abandoning BP-305979739-192.168.1.10= -1403675139790:blk_1073741836_1012
14/07/09 14:47:39 INFO hdfs.DFSClient: Excluding datanode 192.168.1.105:50010
14/07/09 14:47:39 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=3D1).= =C2=A0 There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.bloc= kmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNa= mesystem.java:2596)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeR= pcServer.java:563)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslato= rPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNa= menodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(Prot= obufRpcEngine.java:585)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handl= er$1.run(Server.java:1962)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
=C2=A0=C2=A0=C2=A0 at java.security.AccessController.doPrivileged(Native Method)
=C2=A0=C2=A0=C2=A0 at javax.security.auth.Subject.doAs(Subject.java:415)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.security.UserGro= upInformation.doAs(UserGroupInformation.java:1548)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)

=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Client.call(Client.java:1406)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Client.call(Client.java:1359)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.ja= va:206)
=C2=A0=C2=A0=C2=A0 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:5= 7)
=C2=A0=C2=A0=C2=A0 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImp= l.java:43)
=C2=A0=C2=A0=C2=A0 at java.lang.reflect.Method.invoke(Method.java:606)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocat= ionHandler.java:186)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHan= dler.java:102)
=C2=A0=C2=A0=C2=A0 at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlo= ck(ClientNamenodeProtocolTranslatorPB.java:348)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DF= SOutputStream.java:1264)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(D= FSOutputStream.java:1112)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.jav= a:522)
14/07/09 14:47:39 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004
14/07/09 14:47:39 WARN security.UserGroupInformation: PriviledgedActionException as:deep (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=3D1).= =C2=A0 There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(Blo= ckManager.java:1406)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.name= node.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeR= pcServer.java:563)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslato= rPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNa= menodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.ProtobufRpcE= ngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
=C2=A0=C2=A0=C2=A0 at java.security.AccessController.doPrivileged(Native Method)
=C2=A0=C2=A0=C2=A0 at javax.security.auth.Subject.doAs(Subject.java:415)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.j= ava:1548)
=C2=A0=C2=A0=C2=A0 at org.apache.hadoop.ipc.Server$Handl= er.run(Server.java:1956)

14/07/09 14:47:39 WARN security.UserGroupInformation: PriviledgedActionException as:deep (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=3D1).= =C2=A0 There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)

Exception in thread "main" org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hadoop-yarn/staging/deep/.staging/job_1404881282667_0004/job.jar could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)

    at org.apache.hadoop.ipc.Client.call(Client.java:1406)
    at org.apache.hadoop.ipc.Client.call(Client.java:1359)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:348)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1264)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1112)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
rmr: DEPRECATED: Please use 'rm -r' instead.
14/07/09 14:47:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable rmr: `pegasus/graphs/catstar/results/deg/inout/*': No such file or directory
14/07/09 14:47:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable mv: `dd_node_deg': No such file or directory
14/07/09 14:47:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable mv: `dd_deg_count': No such file or directory
14/07/09 14:47:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable cat: `pegasus/graphs/catstar/results/deg/inout/dd_deg_count/*': No such file or directory
Error: can't mine inout degree of the graph catstar. Check whether the inout degree is computed, or gnuplot is installed correctly.

***GNUplot, ant, python and java have been installed correctly.
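For what it is worth, the "could only be replicated to 0 nodes" and ConnectTimeoutException messages above point at the datanode rather than at Pegasus itself. These are the standard Hadoop CLI checks one could run from the client machine to confirm that (nothing Pegasus-specific; the host and port come from the log above):

```shell
# Ask the namenode how many datanodes it sees as live, and their capacity.
hdfs dfsadmin -report

# The ConnectTimeoutException names the datanode's data-transfer address;
# verify it is reachable from this machine (firewall/hosts-file issues
# commonly cause exactly this timeout).
telnet 192.168.1.105 50010

# Confirm whether the partially copied edge file was left behind in HDFS.
hdfs dfs -ls /user/deep/pegasus/graphs/catstar/edge/
```

These commands need a running cluster, so they are shown for reference only; if `dfsadmin -report` shows 0 live datanodes or the telnet connection times out, the replication error follows directly.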



--
Whether you think you can or you cannot.....either way you are right....
With Regards...
Deep
