From: Mohammad Tariq <dontariq@gmail.com>
Date: Fri, 23 Nov 2012 11:20:17 +0530
Subject: Re: hadoop-1.0.4 installation error
To: user@hadoop.apache.org

Hello Yogesh,

    Just make sure you are operating as the correct user.
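As a quick check (the paths below are only placeholders; substitute whatever dfs.data.dir and hadoop.tmp.dir actually point to in your hdfs-site.xml and core-site.xml), something like this shows which user you are starting the daemons as and whether the DataNode directories belong to that user:

    # which user is starting the daemons?
    whoami

    # which Hadoop daemons are currently up?
    jps

    # do the DataNode directories exist and belong to that user?
    ls -ld /home/yogesh/hdfs/data      # placeholder: use your dfs.data.dir value

    # if they were created by another user (e.g. via sudo), hand them back:
    sudo chown -R yogesh /home/yogesh/hdfs

If the directories are owned by a different user, the DataNode will refuse to use them even though the mode is 755.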
Regards,
    Mohammad Tariq


On Fri, Nov 23, 2012 at 11:13 AM, yogesh dhari <yogeshdhari@live.com> wrote:
> Hello,
>
> I am facing an error while installing Apache hadoop-1.0.4. The TaskTracker
> and DataNode don't start.
>
> The log files show:
>
> TT's log file::
>
> 2012-11-23 10:54:31,116 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:yogesh cause:java.io.IOException: Call to localhost/127.0.0.1:8001 failed on local exception: java.io.IOException: Connection reset by peer
> 2012-11-23 10:54:31,117 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.io.IOException: Call to localhost/127.0.0.1:8001 failed on local exception: java.io.IOException: Connection reset by peer
>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at org.apache.hadoop.mapred.$Proxy5.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>
>
> DN's log file:: (although the files exist with file permission 755)
>
> 2012-11-23 10:50:16,753 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: All specified directories are not accessible or do not exist.
>     at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:139)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:385)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
>     at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
>
> Please suggest.
>
> Thanks & regards,
> Yogesh Kumar
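The "Connection reset by peer" in the TaskTracker log above indicates the TaskTracker could not complete its RPC connection to a JobTracker at localhost:8001; in Hadoop 1.x that address comes from mapred.job.tracker. A minimal sketch of what to check, assuming 8001 is the port configured in conf/mapred-site.xml (the value below is a placeholder, not necessarily the actual file):

    <!-- conf/mapred-site.xml -->
    <property>
      <name>mapred.job.tracker</name>
      <value>localhost:8001</value>
    </property>

    # confirm the JobTracker process is actually running and listening on that port
    jps
    netstat -tln | grep 8001

If jps does not list a JobTracker, its own log under $HADOOP_HOME/logs usually explains why it exited; the TaskTracker cannot start until the JobTracker is reachable.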