From: Geelong Yao <geelongyao@gmail.com>
Date: Fri, 10 May 2013 18:57:38 +0800
Subject: Re: Issue with running my first hadoop program using eclipse
To: user@hadoop.apache.org

I think the main problem may be the permissions of your tmp directory.

2013/5/10 Ramya S <ramyas@suntecgroup.com>
> Hi,
> I am new to Hadoop, so please help me tackle the following issue.
>
> I have installed Apache Hadoop in pseudo-distributed mode on a single node
> with hadoop-1.0.4. It works fine, and I tried some WordCount examples over
> PuTTY too. But moving to the Eclipse IDE on Windows, I got stuck on the
> first program. I have already installed JDK 6 and the MapReduce plugin
> version 1.0.4 as well. When trying to "Run on Hadoop", the error occurs as
> follows:
>
> WARN util.NativeCodeLoader: Unable to load native-hadoop library for
> your platform... using builtin-java classes where applicable
> ERROR security.UserGroupInformation:
> PriviledgedActionException as:ramyas cause:java.io.IOException:
> Failed to set permissions of path:
> \tmp\hadoop-ramyas\mapred\staging\ramyas-1375395355\.staging to 0700
>
> java.io.IOException: Failed to set permissions of path:
> \tmp\hadoop-ramyas\mapred\staging\ramyas-1375395355\.staging to 0700
>         at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
>         at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>         at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>         at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>         at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Unknown Source)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
>         at WordCount.main(WordCount.java:29)
>
> Also, I want to know about configuring the MapReduce plugin in Eclipse.
>
> Thanks and regards,
> Ramya.S

--
From Good To Great
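The exception in the trace bubbles up from FileUtil.checkReturnValue, which throws when the java.io.File permission setters report failure — on a plain NTFS volume they typically return false, which is why the same job runs on the Linux node over PuTTY but not from Eclipse on Windows. The mechanism can be reproduced without Hadoop at all; the sketch below only approximates the hadoop-1.0.4 FileUtil logic, and the class and method names are illustrative, not Hadoop's:

```java
import java.io.File;
import java.io.IOException;

// Standalone approximation of the check that fails in the stack trace:
// hadoop-1.0.4's FileUtil.setPermission() uses the java.io.File
// permission setters, and FileUtil.checkReturnValue() turns a 'false'
// result into the "Failed to set permissions of path ... to 0700"
// IOException seen above. Names below are illustrative only.
public class PermissionCheck {

    static void setPermission0700(File dir) throws IOException {
        // Clear the group/other bits, then grant the owner rwx (0700).
        boolean ok = dir.setReadable(false, false)
                   && dir.setWritable(false, false)
                   && dir.setExecutable(false, false)
                   && dir.setReadable(true, true)
                   && dir.setWritable(true, true)
                   && dir.setExecutable(true, true);
        if (!ok) {
            // Same shape of failure as FileUtil.checkReturnValue.
            throw new IOException("Failed to set permissions of path: "
                    + dir + " to 0700");
        }
    }

    public static void main(String[] args) throws IOException {
        File staging = new File(System.getProperty("java.io.tmpdir"),
                                "hadoop-staging-demo");
        staging.mkdirs();
        setPermission0700(staging);
        System.out.println("0700 applied to " + staging);
    }
}
```

On a POSIX filesystem this completes quietly; on raw NTFS the setters tend to return false and the same IOException is thrown, regardless of which tmp directory is chosen.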
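On the tmp-directory suggestion: the staging path in the error (\tmp\hadoop-ramyas\mapred\staging) is derived from hadoop.tmp.dir, which defaults to /tmp/hadoop-${user.name}. A minimal sketch of relocating it in core-site.xml — the path below is a hypothetical example, not from the original thread; pick a directory the Eclipse user owns:

```xml
<!-- core-site.xml: relocate Hadoop's scratch space.
     /home/ramyas/hadoop-tmp is an illustrative placeholder path. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/ramyas/hadoop-tmp</value>
</property>
```

On Windows this alone may not be enough: hadoop-1.0.x still attempts the chmod-to-0700 step, which is a known limitation of that release line on NTFS. Common workarounds at the time were running Hadoop under Cygwin or using a hadoop-core jar patched so that FileUtil.checkReturnValue does not fail on Windows.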