hadoop-user mailing list archives

From Visioner Sadak <visioner.sa...@gmail.com>
Subject Re: Exception while running a Hadoop example on a standalone install on Windows 7
Date Wed, 05 Sep 2012 06:51:28 GMT
Hadoop 1.0.3 will give you a lot of problems on Windows with Cygwin because of
the complexity of Cygwin configuration paths, so it is better to downgrade to a
lower version for development and testing on Windows (I downgraded to 0.22.0),
and use 1.0.3 in production on Linux servers. I will attach a tutorial for
installing Hadoop on Windows 7 with Cygwin soon.
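The downgrade suggested above would look roughly like this under Cygwin (a sketch; the Apache archive URL and tarball layout are assumptions, and the download itself is left commented out so you can verify the URL first):

```shell
# Sketch: fetch and unpack an older Hadoop release for use under Cygwin.
# The archive URL below is an assumption -- check it before relying on it.
HADOOP_VERSION=0.22.0
TARBALL="hadoop-${HADOOP_VERSION}.tar.gz"
# wget "https://archive.apache.org/dist/hadoop/core/hadoop-${HADOOP_VERSION}/${TARBALL}"
# tar -xzf "${TARBALL}"
echo "would unpack ${TARBALL} into hadoop-${HADOOP_VERSION}/"
```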

On Tue, Sep 4, 2012 at 6:16 PM, Udayini Pendyala <udayini_pendyala@yahoo.com> wrote:

>   Hi,
>
>
> Following is a description of what I am trying to do and the steps I
> followed.
>
>
> GOAL:
>
> a). Install Hadoop 1.0.3
>
> b). Hadoop in a standalone (or local) mode
>
> c). OS: Windows 7
>
>
> STEPS FOLLOWED:
>
> 1.   I followed the instructions at:
> http://www.oreillynet.com/pub/a/other-programming/excerpts/hadoop-tdg/installing-apache-hadoop.html.
> The steps I followed were:
>
> a.       I went to: http://hadoop.apache.org/core/releases.html.
>
> b.      I installed hadoop-1.0.3 by downloading “hadoop-1.0.3.tar.gz” and
> unzipping/untarring the file.
>
> c.       I installed JDK 1.6 and set up JAVA_HOME to point to it.
>
> d.      I set up HADOOP_INSTALL to point to my Hadoop install location. I
> updated my PATH variable to have $HADOOP_INSTALL/bin
>
> e.      After the above steps, I ran the command: “hadoop version” and
> got the following information:
>
> $ hadoop version
>
> Hadoop 1.0.3
>
> Subversion
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1335192
>
> Compiled by hortonfo on Tue May 8 20:31:25 UTC 2012
>
> From source with checksum e6b0c1e23dcf76907c5fecb4b832f3be
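Steps c and d above amount to the following environment setup (a sketch; the JDK and Hadoop paths are placeholders, not the poster's actual locations):

```shell
# Placeholder paths -- substitute your real JDK and Hadoop install locations.
export JAVA_HOME="$HOME/jdk1.6.0"
export HADOOP_INSTALL="$HOME/hadoop-1.0.3"
export PATH="$PATH:$HADOOP_INSTALL/bin"

# Sanity check: confirm the Hadoop bin directory is now on PATH.
case ":$PATH:" in
  *":$HADOOP_INSTALL/bin:"*) echo "PATH ok" ;;
  *)                         echo "PATH missing hadoop bin" ;;
esac
```

With this in place, `hadoop version` resolves from any directory, which matches the output shown above.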
>
>
>
> 2.  The standalone install was easy, as described above. Then I tried to
> run a sample command as given in:
>
> http://hadoop.apache.org/common/docs/r0.17.2/quickstart.html#Local
>
> Specifically, the steps followed were:
>
> a.       cd $HADOOP_INSTALL
>
> b.      mkdir input
>
> c.       cp conf/*.xml input
>
> d.      bin/hadoop jar hadoop-examples-1.0.3.jar grep input output 'dfs[a-z.]+'
>
> and got the following error:
>
>
>
> $ bin/hadoop jar hadoop-examples-1.0.3.jar grep input output 'dfs[a-z.]+'
>
> 12/09/03 15:41:57 WARN util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
>
> 12/09/03 15:41:57 ERROR security.UserGroupInformation: PriviledgedActionException
> as:upendyal cause:java.io.IOException: Failed to set permissions of path:
> \tmp\hadoop-upendyal\mapred\staging\upendyal-1075683580\.staging to 0700
>
> java.io.IOException: Failed to set permissions of path:
> \tmp\hadoop-upendyal\mapred\staging\upendyal-1075683580\.staging to 0700
>
> at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
>
> at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
>
> at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>
> at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>
> at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>
> at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
>
> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
>
> at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Unknown Source)
>
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>
> at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>
> at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
>
> at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>
> at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>
> at java.lang.reflect.Method.invoke(Unknown Source)
>
> at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>
> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>
> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>
> at java.lang.reflect.Method.invoke(Unknown Source)
>
> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
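Reading the trace bottom-up: JobClient creates a local staging directory for the job, then tries to restrict it to mode 0700; RawLocalFileSystem.setPermission fails on Windows, and FileUtil.checkReturnValue turns the failed call into the IOException shown. On a POSIX filesystem the same operation is routine, which is why the example runs on Linux. A minimal sketch of what the staging setup amounts to (the path here is illustrative, not Hadoop's actual layout):

```shell
# Create a job staging directory and restrict it to the owner (mode 0700),
# as Hadoop's LocalFileSystem does. The chmod is the step that fails on
# Windows, producing the "Failed to set permissions" IOException above.
STAGING="$(mktemp -d)/staging"
mkdir -p "$STAGING"
chmod 700 "$STAGING"
ls -ld "$STAGING" | cut -c1-10
```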
>
>
>
> 3.   I searched for this problem and found the links below, but none of the
> suggestions helped. Most people seem to resolve it by changing their
> Hadoop version.
>
> a.
> http://mail-archives.apache.org/mod_mbox/hadoop-common-user/201105.mbox/%3CBANLkTin-8+z8uYBTdmaa4cvxz4JzM14VfA@mail.gmail.com%3E
>
> b.
> http://comments.gmane.org/gmane.comp.jakarta.lucene.hadoop.user/25837
>
>
> Is this a problem with the version of Hadoop I selected, or am I doing
> something wrong? I would appreciate any help with this.
>
> Thanks
>
> Udayini
>
