From: "chaitanya krishna" <chaitanyavv.iiith@gmail.com>
To: core-user@hadoop.apache.org
Date: Sat, 12 Jul 2008 11:43:55 +0530
Subject: Re: Compiling Word Count in C++ : Hadoop Pipes

Thanks a lot for the reply! I'll try to sort out the permission issues.
Hopefully, it should work then.

On Fri, Jul 11, 2008 at 10:04 PM, Sandy wrote:

> hadoop-0.17.0 should work. I took a closer look at your error message. It
> seems you need to change permissions on some of your files.
>
> Try:
>
>     chmod 755 /home/jobs/hadoop-0.17.0/src/examples/pipes/configure
>
> At this point you will probably get another "build failed" message, because
> you need to do the same thing on another file (I don't remember it off the
> top of my head).
> But you can find it out by inspecting this part of the error message:
>
>     BUILD FAILED
>     /home/jobs/hadoop-0.17.0/build.xml:1040: Execute failed:
>     java.io.IOException: Cannot run program
>     "/home/jobs/hadoop-0.17.0/src/examples/pipes/configure" (in directory
>     "/home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/examples/pipes"):
>     java.io.IOException: error=13, Permission denied
>
> This means that the file you can't run is
> "/home/jobs/hadoop-0.17.0/src/examples/pipes/configure", due to permission
> issues. A chmod 755 will fix this. You'll need to do this for any
> "permission denied" message you get that is associated with this.
>
> Hope this helps!
>
> -SM
>
> On Thu, Jul 10, 2008 at 10:03 PM, chaitanya krishna <
> chaitanyavv.iiith@gmail.com> wrote:
>
> > I'm using hadoop-0.17.0. Should I be using a more recent version?
> > Please tell me which version you used.
> >
> > On Fri, Jul 11, 2008 at 2:35 AM, Sandy wrote:
> >
> > > One last thing:
> > >
> > > If that doesn't work, try following the instructions in the Ubuntu
> > > "setting up Hadoop" tutorial. Even if you aren't running Ubuntu, I
> > > think it may be possible to use those instructions to set things up
> > > properly. That's what I eventually did.
> > >
> > > Link is here:
> > > http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)
> > >
> > > -SM
> > >
> > > On Thu, Jul 10, 2008 at 4:02 PM, Sandy wrote:
> > >
> > > > So, I had run into a similar issue. What version of Hadoop are you
> > > > using?
> > > >
> > > > Make sure you are using the latest version of hadoop. That actually
> > > > fixed it for me.
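[Editor's note: the chmod workaround discussed above can be applied in one pass instead of re-running ant after each "Permission denied". The sketch below is illustrative, not from the thread: it builds a tiny demo layout standing in for the real source tree, then makes every `configure` script under `src/` executable with a single `find`.]

```shell
# Sketch of the fix discussed in the thread: chmod every `configure` script
# up front. HADOOP_HOME and the demo layout are assumptions for illustration.
HADOOP_HOME=${HADOOP_HOME:-$PWD/hadoop-0.17.0}

# Demo layout standing in for a real unpacked Hadoop source tree:
mkdir -p "$HADOOP_HOME/src/c++/pipes" "$HADOOP_HOME/src/examples/pipes"
touch "$HADOOP_HOME/src/c++/pipes/configure" \
      "$HADOOP_HOME/src/examples/pipes/configure"

# The actual fix: find every configure script and make it executable,
# so ant's Execute task can run them.
find "$HADOOP_HOME/src" -name configure -type f -exec chmod 755 {} +
```

On a real tree, only the `find`/`chmod` line is needed; the mkdir/touch lines just make this sketch self-contained.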
> > > > There was something wrong with the build.xml file in earlier versions
> > > > that prevented me from being able to get it to work properly. Once I
> > > > upgraded to the latest, it went away.
> > > >
> > > > Hope this helps!
> > > >
> > > > -SM
> > > >
> > > > On Thu, Jul 10, 2008 at 1:39 PM, chaitanya krishna <
> > > > chaitanyavv.iiith@gmail.com> wrote:
> > > >
> > > >> Hi,
> > > >>
> > > >> I faced a similar problem to Sandy's, but this time I even had the
> > > >> jdk set properly.
> > > >>
> > > >> When I executed:
> > > >>
> > > >>     ant -Dcompile.c++=yes examples
> > > >>
> > > >> the following was displayed:
> > > >>
> > > >>     Buildfile: build.xml
> > > >>
> > > >>     clover.setup:
> > > >>
> > > >>     clover.info:
> > > >>          [echo]
> > > >>          [echo] Clover not found. Code coverage reports disabled.
> > > >>          [echo]
> > > >>
> > > >>     clover:
> > > >>
> > > >>     init:
> > > >>         [touch] Creating /tmp/null358480626
> > > >>        [delete] Deleting: /tmp/null358480626
> > > >>          [exec] svn: '.' is not a working copy
> > > >>          [exec] svn: '.' is not a working copy
> > > >>
> > > >>     record-parser:
> > > >>
> > > >>     compile-rcc-compiler:
> > > >>
> > > >>     compile-core-classes:
> > > >>         [javac] Compiling 2 source files to
> > > >>     /home/jobs/hadoop-0.17.0/build/classes
> > > >>
> > > >>     compile-core-native:
> > > >>
> > > >>     check-c++-makefiles:
> > > >>
> > > >>     create-c++-pipes-makefile:
> > > >>
> > > >>     BUILD FAILED
> > > >>     /home/jobs/hadoop-0.17.0/build.xml:1017: Execute failed:
> > > >>     java.io.IOException: Cannot run program
> > > >>     "/home/jobs/hadoop-0.17.0/src/c++/pipes/configure" (in directory
> > > >>     "/home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/pipes"):
> > > >>     java.io.IOException: error=13, Permission denied
> > > >>
> > > >> When, as suggested by Lohith, the following was executed:
> > > >>
> > > >>     ant -Dcompile.c++=yes compile-c++-examples
> > > >>
> > > >> the following was displayed:
> > > >>
> > > >>     Buildfile: build.xml
> > > >>
> > > >>     init:
> > > >>         [touch] Creating /tmp/null1037468845
> > > >>        [delete] Deleting: /tmp/null1037468845
> > > >>          [exec] svn: '.' is not a working copy
> > > >>          [exec] svn: '.' is not a working copy
> > > >>
> > > >>     check-c++-makefiles:
> > > >>
> > > >>     create-c++-examples-pipes-makefile:
> > > >>         [mkdir] Created dir:
> > > >>     /home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/examples/pipes
> > > >>
> > > >>     BUILD FAILED
> > > >>     /home/jobs/hadoop-0.17.0/build.xml:1040: Execute failed:
> > > >>     java.io.IOException: Cannot run program
> > > >>     "/home/jobs/hadoop-0.17.0/src/examples/pipes/configure" (in directory
> > > >>     "/home/jobs/hadoop-0.17.0/build/c++-build/Linux-i386-32/examples/pipes"):
> > > >>     java.io.IOException: error=13, Permission denied
> > > >>
> > > >>     Total time: 0 seconds
> > > >>
> > > >> Please help me out with this problem.
> > > >>
> > > >> Thank you.
> > > >> V.V.Chaitanya Krishna
> > > >>
> > > >> On Thu, Jun 26, 2008 at 9:49 PM, Sandy wrote:
> > > >>
> > > >> > Thanks for the suggestion! That fixed it! My .bash_profile now
> > > >> > looks like:
> > > >> >
> > > >> >     PATH=$PATH:$HOME/bin:/usr/java/jdk1.6.0_06
> > > >> >     JAVA_HOME=/usr/java/jdk1.6.0_06; export JAVA_HOME
> > > >> >
> > > >> > Thanks so much for the help!
> > > >> >
> > > >> > I do have a couple more questions about getting the word count
> > > >> > example to work... If I copy and paste the code into a file called
> > > >> > 'example' or 'example.C', I don't see how the compile and execute
> > > >> > commands are even touching that code. Would someone be willing to
> > > >> > explain?
> > > >> >
> > > >> > I have a few other questions, but I'm going to read up on some
> > > >> > more documentation first :-) Thanks!
> > > >> >
> > > >> > -SM
> > > >> >
> > > >> > On Thu, Jun 26, 2008 at 4:45 AM, Zheng Shao wrote:
> > > >> >
> > > >> > > The error message still mentions
> > > >> > > "/usr/java/jre1.6.0_06/lib/tools.jar". You can try changing
> > > >> > > PATH to the jdk as well.
> > > >> > >
> > > >> > > Ant shouldn't be looking for files in the jre directory.
> > > >> > >
> > > >> > > By the way, are you on cygwin? Not sure if the hadoop native
> > > >> > > lib build is supported on cygwin or not.
> > > >> > >
> > > >> > > Zheng
> > > >> > >
> > > >> > > -----Original Message-----
> > > >> > > From: Sandy [mailto:snickerdoodle08@gmail.com]
> > > >> > > Sent: Wednesday, June 25, 2008 2:31 PM
> > > >> > > To: core-user@hadoop.apache.org
> > > >> > > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> > > >> > >
> > > >> > > My apologies. I had thought I had made that change already.
> > > >> > >
> > > >> > > Regardless, I still get the same error:
> > > >> > >
> > > >> > >     $ ant -Dcompile.c++=yes compile-c++-examples
> > > >> > >     Unable to locate tools.jar.
> > > >> > >     Expected to find it in /usr/java/jre1.6.0_06/lib/tools.jar
> > > >> > >     Buildfile: build.xml
> > > >> > >
> > > >> > >     init:
> > > >> > >         [touch] Creating /tmp/null265867151
> > > >> > >        [delete] Deleting: /tmp/null265867151
> > > >> > >          [exec] svn: '.' is not a working copy
> > > >> > >          [exec] svn: '.' is not a working copy
> > > >> > >
> > > >> > >     check-c++-makefiles:
> > > >> > >
> > > >> > >     create-c++-examples-pipes-makefile:
> > > >> > >
> > > >> > >     create-c++-pipes-makefile:
> > > >> > >
> > > >> > >     create-c++-utils-makefile:
> > > >> > >
> > > >> > >     BUILD FAILED
> > > >> > >     /home/sjm/Desktop/hadoop-0.16.4/build.xml:947: Execute failed:
> > > >> > >     java.io.IOException: Cannot run program
> > > >> > >     "/home/sjm/Desktop/hadoop-0.16.4/src/c++/utils/configure" (in directory
> > > >> > >     "/home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/utils"):
> > > >> > >     java.io.IOException: error=13, Permission denied
> > > >> > >
> > > >> > >     Total time: 1 second
> > > >> > >
> > > >> > > My .bash_profile now contains the line
> > > >> > >
> > > >> > >     JAVA_HOME=/usr/java/jdk1.6.0_06; export JAVA_HOME
> > > >> > >
> > > >> > > I then did
> > > >> > >
> > > >> > >     source .bash_profile
> > > >> > >     conf/hadoop-env.sh
> > > >> > >
> > > >> > > Is there anything else I need to do to make the changes take
> > > >> > > effect?
> > > >> > >
> > > >> > > Thanks again for the assistance.
> > > >> > >
> > > >> > > -SM
> > > >> > >
> > > >> > > On Wed, Jun 25, 2008 at 3:43 PM, lohit wrote:
> > > >> > >
> > > >> > > > Maybe set it to the JDK home? I have set it to my JDK.
> > > >> > > >
> > > >> > > > ----- Original Message ----
> > > >> > > > From: Sandy
> > > >> > > > To: core-user@hadoop.apache.org
> > > >> > > > Sent: Wednesday, June 25, 2008 12:31:18 PM
> > > >> > > > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> > > >> > > >
> > > >> > > > I am under the impression that it already is. As I posted in
> > > >> > > > my original e-mail, here are the declarations in hadoop-env.sh
> > > >> > > > and my .bash_profile.
> > > >> > > >
> > > >> > > > My hadoop-env.sh file looks something like:
> > > >> > > >
> > > >> > > >     # Set Hadoop-specific environment variables here.
> > > >> > > >
> > > >> > > >     # The only required environment variable is JAVA_HOME. All
> > > >> > > >     # others are optional. When running a distributed
> > > >> > > >     # configuration it is best to set JAVA_HOME in this file,
> > > >> > > >     # so that it is correctly defined on remote nodes.
> > > >> > > >
> > > >> > > >     # The java implementation to use. Required.
> > > >> > > >     # export JAVA_HOME=$JAVA_HOME
> > > >> > > >
> > > >> > > > and my .bash_profile file has these lines in it:
> > > >> > > >
> > > >> > > >     JAVA_HOME=/usr/java/jre1.6.0_06; export JAVA_HOME
> > > >> > > >     export PATH
> > > >> > > >
> > > >> > > > Is there a different way I'm supposed to set the JAVA_HOME
> > > >> > > > environment variable?
> > > >> > > >
> > > >> > > > Much thanks,
> > > >> > > >
> > > >> > > > -SM
> > > >> > > >
> > > >> > > > On Wed, Jun 25, 2008 at 3:22 PM, Zheng Shao <zshao@facebook.com>
> > > >> > > > wrote:
> > > >> > > >
> > > >> > > > > You need to set JAVA_HOME to your jdk directory (instead of
> > > >> > > > > the jre). This is required by ant.
> > > >> > > > >
> > > >> > > > > Zheng
> > > >> > > > >
> > > >> > > > > -----Original Message-----
> > > >> > > > > From: Sandy [mailto:snickerdoodle08@gmail.com]
> > > >> > > > > Sent: Wednesday, June 25, 2008 11:22 AM
> > > >> > > > > To: core-user@hadoop.apache.org
> > > >> > > > > Subject: Re: Compiling Word Count in C++ : Hadoop Pipes
> > > >> > > > >
> > > >> > > > > I'm not sure how this answers my question. Could you be more
> > > >> > > > > specific? I am still getting the above error when I type
> > > >> > > > > this command in. To summarize:
> > > >> > > > >
> > > >> > > > > With my current setup, this occurs:
> > > >> > > > >
> > > >> > > > >     $ ant -Dcompile.c++=yes compile-c++-examples
> > > >> > > > >     Unable to locate tools.jar. Expected to find it in
> > > >> > > > >     /usr/java/jre1.6.0_06/lib/tools.jar
> > > >> > > > >     Buildfile: build.xml
> > > >> > > > >
> > > >> > > > >     init:
> > > >> > > > >         [touch] Creating /tmp/null2044923713
> > > >> > > > >        [delete] Deleting: /tmp/null2044923713
> > > >> > > > >          [exec] svn: '.' is not a working copy
> > > >> > > > >          [exec] svn: '.' is not a working copy
> > > >> > > > >
> > > >> > > > >     check-c++-makefiles:
> > > >> > > > >
> > > >> > > > >     create-c++-examples-pipes-makefile:
> > > >> > > > >         [mkdir] Created dir:
> > > >> > > > >     /home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/examples/pipes
> > > >> > > > >
> > > >> > > > >     BUILD FAILED
> > > >> > > > >     /home/sjm/Desktop/hadoop-0.16.4/build.xml:987: Execute failed:
> > > >> > > > >     java.io.IOException: Cannot run program
> > > >> > > > >     "/home/sjm/Desktop/hadoop-0.16.4/src/examples/pipes/configure"
> > > >> > > > >     (in directory
> > > >> > > > >     "/home/sjm/Desktop/hadoop-0.16.4/build/c++-build/Linux-i386-32/examples/pipes"):
> > > >> > > > >     java.io.IOException: error=13, Permission denied
> > > >> > > > >
> > > >> > > > >     Total time: 1 second
> > > >> > > > >
> > > >> > > > > If I copy the tools.jar file located in my jdk's lib folder,
> > > >> > > > > I get the error message I printed in the previous message.
> > > >> > > > >
> > > >> > > > > Could someone please tell me or suggest to me what I am
> > > >> > > > > doing wrong?
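[Editor's note: the JDK-vs-JRE point being made in this exchange — ant needs `lib/tools.jar`, which only a JDK ships — can be checked mechanically before running the build. The sketch below is illustrative: the `has_tools_jar` helper and the `demo/` directories are invented for this example, standing in for `/usr/java/jdk1.6.0_06` and `/usr/java/jre1.6.0_06`.]

```shell
# Hypothetical helper: a JAVA_HOME is usable for ant only if it looks like
# a JDK, i.e. it ships lib/tools.jar (the javac compiler classes).
has_tools_jar() { [ -f "$1/lib/tools.jar" ]; }

# Fake layouts standing in for the jdk and jre directories from the thread:
mkdir -p demo/jdk1.6.0_06/lib demo/jre1.6.0_06/lib
: > demo/jdk1.6.0_06/lib/tools.jar   # JDKs ship tools.jar; JREs do not

has_tools_jar demo/jdk1.6.0_06 && echo "jdk: usable as JAVA_HOME for ant"
has_tools_jar demo/jre1.6.0_06 || echo "jre: ant will not find tools.jar"
```

This is why copying tools.jar into the jre folder only moves the failure: the jre still lacks the rest of the JDK (javac on PATH, compiler classes), so pointing JAVA_HOME at the jdk directory is the real fix.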
> > > >> > > > > Thanks,
> > > >> > > > >
> > > >> > > > > -SM
> > > >> > > > >
> > > >> > > > > On Wed, Jun 25, 2008 at 1:53 PM, lohit wrote:
> > > >> > > > >
> > > >> > > > > > ant -Dcompile.c++=yes compile-c++-examples
> > > >> > > > > >
> > > >> > > > > > I picked it up from build.xml.
> > > >> > > > > >
> > > >> > > > > > Thanks,
> > > >> > > > > > Lohit
> > > >> > > > > >
> > > >> > > > > > ----- Original Message ----
> > > >> > > > > > From: Sandy
> > > >> > > > > > To: core-user@hadoop.apache.org
> > > >> > > > > > Sent: Wednesday, June 25, 2008 10:44:20 AM
> > > >> > > > > > Subject: Compiling Word Count in C++ : Hadoop Pipes
> > > >> > > > > >
> > > >> > > > > > Hi,
> > > >> > > > > >
> > > >> > > > > > I am currently trying to get Hadoop Pipes working. I am
> > > >> > > > > > following the instructions at the hadoop wiki, where it
> > > >> > > > > > provides code for a C++ implementation of Word Count
> > > >> > > > > > (located here:
> > > >> > > > > > http://wiki.apache.org/hadoop/C++WordCount?highlight=%28C%2B%2B%29)
> > > >> > > > > >
> > > >> > > > > > I am having some trouble parsing the instructions. What
> > > >> > > > > > should the file containing the new word count program be
> > > >> > > > > > called? "examples"?
> > > >> > > > > >
> > > >> > > > > > If I were to call the file "example" and type in the
> > > >> > > > > > following:
> > > >> > > > > >
> > > >> > > > > >     $ ant -Dcompile.c++=yes example
> > > >> > > > > >     Buildfile: build.xml
> > > >> > > > > >
> > > >> > > > > >     BUILD FAILED
> > > >> > > > > >     Target `example' does not exist in this project.
> > > >> > > > > >
> > > >> > > > > >     Total time: 0 seconds
> > > >> > > > > >
> > > >> > > > > > If I try to compile with "examples" as stated on the wiki,
> > > >> > > > > > I get:
> > > >> > > > > >
> > > >> > > > > >     $ ant -Dcompile.c++=yes examples
> > > >> > > > > >     Buildfile: build.xml
> > > >> > > > > >
> > > >> > > > > >     clover.setup:
> > > >> > > > > >
> > > >> > > > > >     clover.info:
> > > >> > > > > >          [echo]
> > > >> > > > > >          [echo] Clover not found. Code coverage reports disabled.
> > > >> > > > > >          [echo]
> > > >> > > > > >
> > > >> > > > > >     clover:
> > > >> > > > > >
> > > >> > > > > >     init:
> > > >> > > > > >         [touch] Creating /tmp/null810513231
> > > >> > > > > >        [delete] Deleting: /tmp/null810513231
> > > >> > > > > >          [exec] svn: '.' is not a working copy
> > > >> > > > > >          [exec] svn: '.' is not a working copy
> > > >> > > > > >
> > > >> > > > > >     record-parser:
> > > >> > > > > >
> > > >> > > > > >     compile-rcc-compiler:
> > > >> > > > > >         [javac] Compiling 29 source files to
> > > >> > > > > >     /home/sjm/Desktop/hadoop-0.16.4/build/classes
> > > >> > > > > >
> > > >> > > > > >     BUILD FAILED
> > > >> > > > > >     /home/sjm/Desktop/hadoop-0.16.4/build.xml:241: Unable
> > > >> > > > > >     to find a javac compiler;
> > > >> > > > > >     com.sun.tools.javac.Main is not on the classpath.
> > > >> > > > > >     Perhaps JAVA_HOME does not point to the JDK
> > > >> > > > > >
> > > >> > > > > >     Total time: 1 second
> > > >> > > > > >
> > > >> > > > > > I am a bit puzzled by this. Originally I got the error
> > > >> > > > > > that tools.jar was not found, because it was looking for
> > > >> > > > > > it under /usr/java/jre1.6.0_06/lib/tools.jar. There is a
> > > >> > > > > > tools.jar under /usr/java/jdk1.6.0_06/lib/tools.jar. If I
> > > >> > > > > > copy this file over to the jre folder, that message goes
> > > >> > > > > > away and is replaced with the above message.
> > > >> > > > > >
> > > >> > > > > > My hadoop-env.sh file looks something like:
> > > >> > > > > >
> > > >> > > > > >     # Set Hadoop-specific environment variables here.
> > > >> > > > > >
> > > >> > > > > >     # The only required environment variable is JAVA_HOME.
> > > >> > > > > >     # All others are optional. When running a distributed
> > > >> > > > > >     # configuration it is best to set JAVA_HOME in this
> > > >> > > > > >     # file, so that it is correctly defined on remote nodes.
> > > >> > > > > >
> > > >> > > > > >     # The java implementation to use. Required.
> > > >> > > > > >     # export JAVA_HOME=$JAVA_HOME
> > > >> > > > > >
> > > >> > > > > > and my .bash_profile file has these lines in it:
> > > >> > > > > >
> > > >> > > > > >     JAVA_HOME=/usr/java/jre1.6.0_06; export JAVA_HOME
> > > >> > > > > >     export PATH
> > > >> > > > > >
> > > >> > > > > > Furthermore, if I go to the command line and type in
> > > >> > > > > > javac -version, I get:
> > > >> > > > > >
> > > >> > > > > >     $ javac -version
> > > >> > > > > >     javac 1.6.0_06
> > > >> > > > > >
> > > >> > > > > > I also had no problem getting through the hadoop word
> > > >> > > > > > count map reduce tutorial in Java. It was able to find my
> > > >> > > > > > java compiler fine. Could someone please point me in the
> > > >> > > > > > right direction? Also, since it is an sh file, should that
> > > >> > > > > > export line in hadoop-env.sh really start with a hash
> > > >> > > > > > sign?
> > > >> > > > > >
> > > >> > > > > > Thank you in advance for your assistance.
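[Editor's note: to answer the hash-sign question asked above — in an sh file, `#` begins a comment, so the shipped `# export JAVA_HOME=$JAVA_HOME` line in hadoop-env.sh sets nothing. A minimal sketch of the corrected line (the JDK path is the one mentioned in the thread; substitute your own install):]

```shell
# In conf/hadoop-env.sh, '#' begins a comment, so the line shipped as
#   # export JAVA_HOME=$JAVA_HOME
# is inert. Remove the leading '#' and point it at a JDK, not a JRE:
export JAVA_HOME=/usr/java/jdk1.6.0_06
echo "JAVA_HOME=$JAVA_HOME"
```

The commented-out form is intentional on the Hadoop side: it is a template the user is expected to uncomment and fill in on each node.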
> > > >> > > > > > -SM