hadoop-common-user mailing list archives

From "chaitanya krishna" <chaitanyavv.ii...@gmail.com>
Subject Re: Compiling Word Count in C++ : Hadoop Pipes
Date Sat, 12 Jul 2008 06:13:55 GMT
Thanks a lot for the reply!

I'll try to sort out the permission issues. Hopefully, it should work then.

On Fri, Jul 11, 2008 at 10:04 PM, Sandy <snickerdoodle08@gmail.com> wrote:

> hadoop-0.17.0 should work. I took a closer look at your error message. It
> seems you need to change permissions on some of your files.
>
> Try:
>
>  chmod 755 /home/jobs/hadoop-0.17.0/src/examples/pipes/configure
>
>
> At this point you probably will get another "build failed" message, because
> you need to do the same thing on another file (I don't remember it off the
> top of my head). But you can find it out by inspecting this part of the
> error message:
>
> BUILD FAILED
> /home/jobs/hadoop-0.17.0/build.xml:1040: Execute failed:
> java.io.IOException: Cannot run program
> "/home/jobs/hadoop-0.17.0/src/examples/pipes/configure" (in directory
> "/home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/examples/pipes"):
> java.io.IOException: error=13, Permission denied
>
> This means that the file that you can't run is:
> "/home/jobs/hadoop-0.17.0/src/examples/pipes/configure"
>
> due to permission issues. A chmod 755 will fix this. You'll need to do the
> same for any "Permission denied" message you get that is associated with
> this build.
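For what it's worth, all of the configure scripts can be made executable in one pass instead of fixing each "Permission denied" failure as ant hits it. A minimal sketch, assuming a checkout at the path used in this thread (the HADOOP_SRC variable is ours, purely for illustration):

```shell
# Mark every generated `configure` script under the Hadoop source tree
# executable, so ant can run them during the c++ build.
HADOOP_SRC=/home/jobs/hadoop-0.17.0/src   # adjust to your checkout
[ -d "$HADOOP_SRC" ] && find "$HADOOP_SRC" -name configure -type f -exec chmod 755 {} \;
```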
>
> Hope this helps!
>
> -SM
> On Thu, Jul 10, 2008 at 10:03 PM, chaitanya krishna <
> chaitanyavv.iiith@gmail.com> wrote:
>
> > I'm using hadoop-0.17.0. Should I be using a more recent version?
> > Which version did you use?
> >
> > On Fri, Jul 11, 2008 at 2:35 AM, Sandy <snickerdoodle08@gmail.com>
> wrote:
> >
> > > One last thing:
> > >
> > > If that doesn't work, try following the instructions in the Ubuntu
> > > "setting up Hadoop" tutorial. Even if you aren't running Ubuntu, I think
> > > it may be possible to use those instructions to set things up properly.
> > > That's what I eventually did.
> > >
> > > Link is here:
> > >
> > >
> >
> > > http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29
> > >
> > > -SM
> > >
> > > On Thu, Jul 10, 2008 at 4:02 PM, Sandy <snickerdoodle08@gmail.com>
> > wrote:
> > >
> > > > So, I had run into a similar issue. What version of Hadoop are you
> > > > using?
> > > >
> > > > Make sure you are using the latest version of Hadoop. That actually
> > > > fixed it for me. There was something wrong with the build.xml file in
> > > > earlier versions that prevented me from getting it to work properly.
> > > > Once I upgraded to the latest, the problem went away.
> > > >
> > > > Hope this helps!
> > > >
> > > > -SM
> > > >
> > > >
> > > > On Thu, Jul 10, 2008 at 1:39 PM, chaitanya krishna <
> > > > chaitanyavv.iiith@gmail.com> wrote:
> > > >
> > > >> Hi,
> > > >>
> > > >> I faced a similar problem to Sandy's. But this time I even had the
> > > >> jdk set properly.
> > > >>
> > > >> When I executed:
> > > >> ant -Dcompile.c++=yes examples
> > > >>
> > > >> the following was displayed:
> > > >>
> > > >> Buildfile: build.xml
> > > >>
> > > >> clover.setup:
> > > >>
> > > >> clover.info:
> > > >>     [echo]
> > > >>     [echo]      Clover not found. Code coverage reports disabled.
> > > >>     [echo]
> > > >>
> > > >> clover:
> > > >>
> > > >> init:
> > > >>     [touch] Creating /tmp/null358480626
> > > >>   [delete] Deleting: /tmp/null358480626
> > > >>      [exec] svn: '.' is not a working copy
> > > >>     [exec] svn: '.' is not a working copy
> > > >>
> > > >> record-parser:
> > > >>
> > > >> compile-rcc-compiler:
> > > >>
> > > >> compile-core-classes:
> > > >>    [javac] Compiling 2 source files to
> > > >> /home/jobs/hadoop-0.17.0/build/classes
> > > >>
> > > >> compile-core-native:
> > > >>
> > > >> check-c++-makefiles:
> > > >>
> > > >> create-c++-pipes-makefile:
> > > >>
> > > >> BUILD FAILED
> > > >> /home/jobs/hadoop-0.17.0/build.xml:1017: Execute failed:
> > > >> java.io.IOException: Cannot run program
> > > >> "/home/jobs/hadoop-0.17.0/src/c++/pipes/configure" (in directory
> > > >> "/home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/pipes"):
> > > >> java.io.IOException: error=13, Permission denied
> > > >>
> > > >>
> > > >>
> > > >> When, as suggested by Lohit, the following was executed:
> > > >>
> > > >> ant -Dcompile.c++=yes compile-c++-examples
> > > >>
> > > >> the following was displayed:
> > > >> Buildfile: build.xml
> > > >>
> > > >> init:
> > > >>    [touch] Creating /tmp/null1037468845
> > > >>   [delete] Deleting: /tmp/null1037468845
> > > >>      [exec] svn: '.' is not a working copy
> > > >>     [exec] svn: '.' is not a working copy
> > > >>
> > > >> check-c++-makefiles:
> > > >>
> > > >> create-c++-examples-pipes-makefile:
> > > >>    [mkdir] Created dir:
> > > >> /home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/examples/pipes
> > > >>
> > > >> BUILD FAILED
> > > >> /home/jobs/hadoop-0.17.0/build.xml:1040: Execute failed:
> > > >> java.io.IOException: Cannot run program
> > > >> "/home/jobs/hadoop-0.17.0/src/examples/pipes/configure" (in directory
> > > >> "/home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/examples/pipes"):
> > > >> java.io.IOException: error=13, Permission denied
> > > >>
> > > >> Total time: 0 seconds
> > > >>
> > > >>
> > > >> Please help me out with this problem.
> > > >>
> > > >> Thank you.
> > > >>
> > > >> V.V.Chaitanya Krishna
> > > >>
> > > >>
> > > >> On Thu, Jun 26, 2008 at 9:49 PM, Sandy <snickerdoodle08@gmail.com>
> > > wrote:
> > > >>
> > > >> > Thanks for the suggestion! That fixed it! My .bash_profile now
> > > >> > looks like:
> > > >> >
> > > >> > PATH=$PATH:$HOME/bin:/usr/java/jdk1.6.0_06
> > > >> > JAVA_HOME=/usr/java/jdk1.6.0_06; export JAVA_HOME
> > > >> >
> > > >> > Thanks so much for the help!
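As an aside, the jdk-vs-jre distinction behind this fix can be checked mechanically: a JDK ships bin/javac and lib/tools.jar (where ant finds the compiler classes), while a JRE does not. A rough sketch; the check_jdk helper is ours, not part of Hadoop or ant:

```shell
# check_jdk: report whether a Java home looks like a full JDK or just a JRE.
# A JDK ships bin/javac and lib/tools.jar (which ant needs); a JRE does not.
check_jdk() {
    if [ -f "$1/lib/tools.jar" ] && [ -x "$1/bin/javac" ]; then
        echo "JDK"
    else
        echo "JRE"
    fi
}

# usage: check_jdk /usr/java/jdk1.6.0_06   # the path used in this thread
```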
> > > >> >
> > > >> >
> > > >> > I do have a couple more questions about getting the word count
> > > >> > example to work... If I copy and paste the code into a file called
> > > >> > 'example' or 'example.C', I don't see how the compile and execute
> > > >> > commands are even touching that code. Would someone be willing to
> > > >> > explain?
> > > >> >
> > > >> > I have a few other questions, but I'm going to read up on some
> > > >> > more documentation first :-) Thanks!
> > > >> >
> > > >> > -SM
> > > >> >
> > > >> >
> > > >> > On Thu, Jun 26, 2008 at 4:45 AM, Zheng Shao <zshao@facebook.com>
> > > wrote:
> > > >> >
> > > >> > > The error message still mentions
> > > >> > > "/usr/java/jre1.6.0_06/lib/tools.jar". You can try changing PATH
> > > >> > > to the jdk as well.
> > > >> > >
> > > >> > > Ant shouldn't be looking for files in the jre directory.
> > > >> > >
> > > >> > > By the way, are you on cygwin? I'm not sure whether the hadoop
> > > >> > > native lib build is supported on cygwin.
> > > >> > >
> > > >> > >
> > > >> > > Zheng
> > > >> > > -----Original Message-----
> > > >> > > From: Sandy [mailto:snickerdoodle08@gmail.com]
> > > >> > > Sent: Wednesday, June 25, 2008 2:31 PM
> > > >> > > To: core-user@hadoop.apache.org
> > > >> > > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> > > >> > >
> > > >> > > My apologies. I had thought I had made that change already.
> > > >> > >
> > > >> > > Regardless, I still get the same error:
> > > >> > > $ ant -Dcompile.c++=yes compile-c++-examples
> > > >> > > Unable to locate tools.jar. Expected to find it in
> > > >> > > /usr/java/jre1.6.0_06/lib/tools.jar
> > > >> > > Buildfile: build.xml
> > > >> > >
> > > >> > > init:
> > > >> > >    [touch] Creating /tmp/null265867151
> > > >> > >   [delete] Deleting: /tmp/null265867151
> > > >> > >     [exec] svn: '.' is not a working copy
> > > >> > >     [exec] svn: '.' is not a working copy
> > > >> > >
> > > >> > > check-c++-makefiles:
> > > >> > >
> > > >> > > create-c++-examples-pipes-makefile:
> > > >> > >
> > > >> > > create-c++-pipes-makefile:
> > > >> > >
> > > >> > > create-c++-utils-makefile:
> > > >> > >
> > > >> > > BUILD FAILED
> > > >> > > /home/sjm/Desktop/hadoop-0.16.4/build.xml:947: Execute failed:
> > > >> > > java.io.IOException: Cannot run program
> > > >> > > "/home/sjm/Desktop/hadoop-0.16.4/src/c++/utils/configure" (in directory
> > > >> > > "/home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/utils"):
> > > >> > > java.io.IOException: error=13, Permission denied
> > > >> > >
> > > >> > > Total time: 1 second
> > > >> > >
> > > >> > > My .bash_profile now contains the line
> > > >> > > JAVA_HOME=/usr/java/jdk1.6.0_06; export JAVA_HOME
> > > >> > >
> > > >> > > I then did
> > > >> > > source .bash_profile
> > > >> > > conf/hadoop-env.sh
> > > >> > >
> > > >> > > Is there anything else I need to do to make the changes take
> > > >> > > effect?
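One note on the commands above: running conf/hadoop-env.sh as a program only sets variables in a throwaway child shell, so nothing persists; that file is sourced by Hadoop's own launch scripts. The usual fix is to edit it and remove the leading hash sign so the export actually runs. A sketch, using the jdk path from this thread:

```shell
# In conf/hadoop-env.sh -- uncomment the JAVA_HOME line and point it at the JDK:
export JAVA_HOME=/usr/java/jdk1.6.0_06
```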
> > > >> > >
> > > >> > > Thanks again for the assistance.
> > > >> > >
> > > >> > > -SM
> > > >> > >
> > > >> > > On Wed, Jun 25, 2008 at 3:43 PM, lohit <lohit_bv@yahoo.com>
> > wrote:
> > > >> > >
> > > >> > > > may be set it to JDK home? I have set it to my JDK.
> > > >> > > >
> > > >> > > > ----- Original Message ----
> > > >> > > > From: Sandy <snickerdoodle08@gmail.com>
> > > >> > > > To: core-user@hadoop.apache.org
> > > >> > > > Sent: Wednesday, June 25, 2008 12:31:18 PM
> > > >> > > > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> > > >> > > >
> > > >> > > > I am under the impression that it already is. As I posted in
> > > >> > > > my original e-mail, here are the declarations in hadoop-env.sh
> > > >> > > > and my .bash_profile:
> > > >> > > >
> > > >> > > > My hadoop-env.sh file looks something like:
> > > >> > > > # Set Hadoop-specific environment variables here.
> > > >> > > >
> > > >> > > > # The only required environment variable is JAVA_HOME.  All
> > > >> > > > # others are optional.  When running a distributed
> > > >> > > > # configuration it is best to set JAVA_HOME in this file, so
> > > >> > > > # that it is correctly defined on remote nodes.
> > > >> > > >
> > > >> > > > # The java implementation to use.  Required.
> > > >> > > > # export JAVA_HOME=$JAVA_HOME
> > > >> > > >
> > > >> > > >
> > > >> > > > and my .bash_profile file has this line in it:
> > > >> > > > JAVA_HOME=/usr/java/jre1.6.0_06; export JAVA_HOME
> > > >> > > > export PATH
> > > >> > > >
> > > >> > > >
> > > >> > > > Is there a different way I'm supposed to set the JAVA_HOME
> > > >> > > > environment variable?
> > > >> > > >
> > > >> > > > Much thanks,
> > > >> > > >
> > > >> > > > -SM
> > > >> > > > On Wed, Jun 25, 2008 at 3:22 PM, Zheng Shao <zshao@facebook.com>
> > > >> > > > wrote:
> > > >> > > >
> > > >> > > > > You need to set JAVA_HOME to your jdk directory (instead of
> > > >> > > > > jre). This is required by ant.
> > > >> > > > >
> > > >> > > > > Zheng
> > > >> > > > > -----Original Message-----
> > > >> > > > > From: Sandy [mailto:snickerdoodle08@gmail.com]
> > > >> > > > > Sent: Wednesday, June 25, 2008 11:22 AM
> > > >> > > > > To: core-user@hadoop.apache.org
> > > >> > > > > Subject: Re: Compiling Word Count in C++ : Hadoop
Pipes
> > > >> > > > >
> > > >> > > > > I'm not sure how this answers my question. Could you be more
> > > >> > > > > specific? I am still getting the above error when I type this
> > > >> > > > > command in. To summarize:
> > > >> > > > >
> > > >> > > > > With my current setup, this occurs:
> > > >> > > > > $ ant -Dcompile.c++=yes compile-c++-examples
> > > >> > > > > Unable to locate tools.jar. Expected to find it in
> > > >> > > > > /usr/java/jre1.6.0_06/lib/tools.jar
> > > >> > > > > Buildfile: build.xml
> > > >> > > > >
> > > >> > > > > init:
> > > >> > > > >    [touch] Creating /tmp/null2044923713
> > > >> > > > >   [delete] Deleting: /tmp/null2044923713
> > > >> > > > >     [exec] svn: '.' is not a working copy
> > > >> > > > >     [exec] svn: '.' is not a working copy
> > > >> > > > >
> > > >> > > > > check-c++-makefiles:
> > > >> > > > >
> > > >> > > > > create-c++-examples-pipes-makefile:
> > > >> > > > >    [mkdir] Created dir:
> > > >> > > > > /home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/examples/pipes
> > > >> > > > >
> > > >> > > > > BUILD FAILED
> > > >> > > > > /home/sjm/Desktop/hadoop-0.16.4/build.xml:987: Execute failed:
> > > >> > > > > java.io.IOException: Cannot run program
> > > >> > > > > "/home/sjm/Desktop/hadoop-0.16.4/src/examples/pipes/configure"
> > > >> > > > > (in directory
> > > >> > > > > "/home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/examples/pipes"):
> > > >> > > > > java.io.IOException: error=13, Permission denied
> > > >> > > > >
> > > >> > > > > Total time: 1 second
> > > >> > > > >
> > > >> > > > > -----
> > > >> > > > >
> > > >> > > > > If I copy the tools.jar file located in my jdk's lib folder,
> > > >> > > > > I get the error message I printed in the previous message.
> > > >> > > > >
> > > >> > > > > Could someone please tell me what I am doing wrong?
> > > >> > > > >
> > > >> > > > > Thanks,
> > > >> > > > >
> > > >> > > > > -SM
> > > >> > > > >
> > > >> > > > > On Wed, Jun 25, 2008 at 1:53 PM, lohit <lohit_bv@yahoo.com>
> > > >> wrote:
> > > >> > > > >
> > > >> > > > > > ant -Dcompile.c++=yes compile-c++-examples
> > > >> > > > > > I picked it up from build.xml
> > > >> > > > > >
> > > >> > > > > > Thanks,
> > > >> > > > > > Lohit
> > > >> > > > > >
> > > >> > > > > > ----- Original Message ----
> > > >> > > > > > From: Sandy <snickerdoodle08@gmail.com>
> > > >> > > > > > To: core-user@hadoop.apache.org
> > > >> > > > > > Sent: Wednesday, June 25, 2008 10:44:20 AM
> > > >> > > > > > Subject: Compiling Word Count in C++ : Hadoop
Pipes
> > > >> > > > > >
> > > >> > > > > > Hi,
> > > >> > > > > >
> > > >> > > > > > I am currently trying to get Hadoop Pipes working. I am
> > > >> > > > > > following the instructions at the hadoop wiki, which
> > > >> > > > > > provides code for a C++ implementation of Word Count
> > > >> > > > > > (located here:
> > > >> > > > > > http://wiki.apache.org/hadoop/C++WordCount?highlight=%28C%2B%2B%29)
> > > >> > > > > >
> > > >> > > > > > I am having some trouble parsing the instructions. What
> > > >> > > > > > should the file containing the new word count program be
> > > >> > > > > > called? "examples"?
> > > >> > > > > >
> > > >> > > > > > If I were to call the file "example" and type in the
> > > >> > > > > > following:
> > > >> > > > > > $ ant -Dcompile.c++=yes example
> > > >> > > > > > Buildfile: build.xml
> > > >> > > > > >
> > > >> > > > > > BUILD FAILED
> > > >> > > > > > Target `example' does not exist in this project.
> > > >> > > > > >
> > > >> > > > > > Total time: 0 seconds
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > > > If I try to compile with "examples" as stated on the wiki,
> > > >> > > > > > I get:
> > > >> > > > > > $ ant -Dcompile.c++=yes examples
> > > >> > > > > > Buildfile: build.xml
> > > >> > > > > >
> > > >> > > > > > clover.setup:
> > > >> > > > > >
> > > >> > > > > > clover.info:
> > > >> > > > > >     [echo]
> > > >> > > > > >     [echo]      Clover not found. Code coverage reports disabled.
> > > >> > > > > >     [echo]
> > > >> > > > > >
> > > >> > > > > > clover:
> > > >> > > > > >
> > > >> > > > > > init:
> > > >> > > > > >    [touch] Creating /tmp/null810513231
> > > >> > > > > >   [delete] Deleting: /tmp/null810513231
> > > >> > > > > >     [exec] svn: '.' is not a working copy
> > > >> > > > > >     [exec] svn: '.' is not a working copy
> > > >> > > > > >
> > > >> > > > > > record-parser:
> > > >> > > > > >
> > > >> > > > > > compile-rcc-compiler:
> > > >> > > > > >    [javac] Compiling 29 source files to
> > > >> > > > > > /home/sjm/Desktop/hadoop-0.16.4/build/classes
> > > >> > > > > >
> > > >> > > > > > BUILD FAILED
> > > >> > > > > > /home/sjm/Desktop/hadoop-0.16.4/build.xml:241: Unable to
> > > >> > > > > > find a javac compiler;
> > > >> > > > > > com.sun.tools.javac.Main is not on the classpath.
> > > >> > > > > > Perhaps JAVA_HOME does not point to the JDK
> > > >> > > > > >
> > > >> > > > > > Total time: 1 second
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > > > I am a bit puzzled by this. Originally I got the error
> > > >> > > > > > that tools.jar was not found, because it was looking for
> > > >> > > > > > it under /usr/java/jre1.6.0_06/lib/tools.jar . There is a
> > > >> > > > > > tools.jar under /usr/java/jdk1.6.0_06/lib/tools.jar. If I
> > > >> > > > > > copy this file over to the jre folder, that message goes
> > > >> > > > > > away and it's replaced with the above message.
> > > >> > > > > >
> > > >> > > > > > My hadoop-env.sh file looks something like:
> > > >> > > > > > # Set Hadoop-specific environment variables here.
> > > >> > > > > >
> > > >> > > > > > # The only required environment variable is JAVA_HOME.  All
> > > >> > > > > > # others are optional.  When running a distributed
> > > >> > > > > > # configuration it is best to set JAVA_HOME in this file,
> > > >> > > > > > # so that it is correctly defined on remote nodes.
> > > >> > > > > >
> > > >> > > > > > # The java implementation to use.  Required.
> > > >> > > > > > # export JAVA_HOME=$JAVA_HOME
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > > > and my .bash_profile file has this line in it:
> > > >> > > > > > JAVA_HOME=/usr/java/jre1.6.0_06; export JAVA_HOME
> > > >> > > > > > export PATH
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > > > Furthermore, if I go to the command line and type in
> > > >> > > > > > javac -version, I get:
> > > >> > > > > > $ javac -version
> > > >> > > > > > javac 1.6.0_06
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > > > I also had no problem getting through the hadoop word
> > > >> > > > > > count map reduce tutorial in Java. It was able to find my
> > > >> > > > > > java compiler fine. Could someone please point me in the
> > > >> > > > > > right direction? Also, since it is an sh file, should that
> > > >> > > > > > export line in hadoop-env.sh really start with a hash sign?
> > > >> > > > > >
> > > >> > > > > > Thank you in advance for your assistance.
> > > >> > > > > >
> > > >> > > > > > -SM
> > > >> > > > > >
> > > >> > > > > >
> > > >> > > > >
> > > >> > > >
> > > >> > > >
> > > >> > >
> > > >> >
> > > >>
> > > >
> > > >
> > >
> >
>
