From: Sandy <snickerdoodle08@gmail.com>
To: core-user@hadoop.apache.org
Subject: Re: trouble setting up hadoop
Date: Mon, 23 Jun 2008 18:00:05 -0400
Message-ID: <257c70550806231500o48c84aew46fa797ca150723d@mail.gmail.com>
In-Reply-To: <8DB4E94D-1CBA-414F-A2C8-C5C15F7729F1@101tec.com>

Hi Stefan,

I think that did it. When I type in java -version I now get:

java version "1.6.0_06"
Java(TM) SE Runtime Environment (build 1.6.0_06-b02)
Java HotSpot(TM) Client VM (build 10.0-b22, mixed mode, sharing)

And, when I run:

bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z]+'

I get:

08/06/23 17:03:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
08/06/23 17:03:13 INFO mapred.FileInputFormat: Total input paths to process : 2
08/06/23 17:03:13 INFO mapred.JobClient: Running job: job_local_1
08/06/23 17:03:13 INFO mapred.MapTask: numReduceTasks: 1
08/06/23 17:03:13 INFO mapred.LocalJobRunner: file:/home/sjm/Desktop/hadoop-0.16.4/input/hadoop-site.xml:0+178
08/06/23 17:03:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
08/06/23 17:03:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821
08/06/23 17:03:13 INFO mapred.MapTask: numReduceTasks: 1
08/06/23 17:03:13 INFO mapred.LocalJobRunner: file:/home/sjm/Desktop/hadoop-0.16.4/input/hadoop-default.xml:0+34064
08/06/23 17:03:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
08/06/23 17:03:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821
08/06/23 17:03:13 INFO mapred.LocalJobRunner: reduce > reduce
08/06/23 17:03:13 INFO mapred.TaskRunner: Task 'reduce_ov0kiq' done.
08/06/23 17:03:13 INFO mapred.TaskRunner: Saved output of task 'reduce_ov0kiq' to file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821
08/06/23 17:03:14 INFO mapred.JobClient: Job complete: job_local_1
08/06/23 17:03:14 INFO mapred.JobClient: Counters: 9
08/06/23 17:03:14 INFO mapred.JobClient:   Map-Reduce Framework
08/06/23 17:03:14 INFO mapred.JobClient:     Map input records=1125
08/06/23 17:03:14 INFO mapred.JobClient:     Map output records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Map input bytes=34242
08/06/23 17:03:14 INFO mapred.JobClient:     Map output bytes=0
08/06/23 17:03:14 INFO mapred.JobClient:     Combine input records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Combine output records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Reduce input groups=0
08/06/23 17:03:14 INFO mapred.JobClient:     Reduce input records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Reduce output records=0
08/06/23 17:03:14 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
08/06/23 17:03:14 INFO mapred.FileInputFormat: Total input paths to process : 1
08/06/23 17:03:14 INFO mapred.JobClient: Running job: job_local_2
08/06/23 17:03:14 INFO mapred.MapTask: numReduceTasks: 1
08/06/23 17:03:14 INFO mapred.LocalJobRunner: file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821/part-00000:0+86
08/06/23 17:03:14 INFO mapred.TaskRunner: Task 'job_local_2_map_0000' done.
08/06/23 17:03:14 INFO mapred.TaskRunner: Saved output of task 'job_local_2_map_0000' to file:/home/sjm/Desktop/hadoop-0.16.4/output
08/06/23 17:03:14 INFO mapred.LocalJobRunner: reduce > reduce
08/06/23 17:03:14 INFO mapred.TaskRunner: Task 'reduce_448bva' done.
08/06/23 17:03:14 INFO mapred.TaskRunner: Saved output of task 'reduce_448bva' to file:/home/sjm/Desktop/hadoop-0.16.4/output
08/06/23 17:03:15 INFO mapred.JobClient: Job complete: job_local_2
08/06/23 17:03:15 INFO mapred.JobClient: Counters: 9
08/06/23 17:03:15 INFO mapred.JobClient:   Map-Reduce Framework
08/06/23 17:03:15 INFO mapred.JobClient:     Map input records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Map output records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Map input bytes=0
08/06/23 17:03:15 INFO mapred.JobClient:     Map output bytes=0
08/06/23 17:03:15 INFO mapred.JobClient:     Combine input records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Combine output records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Reduce input groups=0
08/06/23 17:03:15 INFO mapred.JobClient:     Reduce input records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Reduce output records=0

Does this all look correct? If so, thank you so much. I really appreciate all
the help!

-SM

On Mon, Jun 23, 2008 at 4:32 PM, Stefan Groschupf wrote:

> Looks like you have not install a correct java.
> Make sure you have a sun java installed on your nodes and java is in your
> path as well JAVA_HOME should be set.
> I think gnu.gcj is the gnu java compiler but not a java you need to run
> hadoop.
> Check on command line this:
> $ java -version
> you should see something like this:
> java version "1.5.0_13"
> Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_13-b05-237)
> Java HotSpot(TM) Client VM (build 1.5.0_13-119, mixed mode, sharing)
>
> HTH
>
> On Jun 23, 2008, at 9:40 PM, Sandy wrote:
>
>> I apologize for the severe basicness of this error, but I am in the
>> process of getting hadoop set up. I have been following the instructions
>> in the Hadoop quickstart. I have confirmed that bin/hadoop will give me
>> help usage information.
>>
>> I am now in the stage of standalone operation.
>>
>> I typed in:
>> mkdir input
>> cp conf/*.xml input
>> bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
>>
>> at which point I get:
>> Exception in thread "main" java.lang.ClassNotFoundException:
>> java.lang.Iterable not found in
>> gnu.gcj.runtime.SystemClassLoader{urls=[file:/home/sjm/Desktop/hadoop-0.16.4/bin/../conf/,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../hadoop-0.16.4-core.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-cli-2.0-SNAPSHOT.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-codec-1.3.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-httpclient-3.0.1.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-logging-1.0.4.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-logging-api-1.0.4.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jets3t-0.5.0.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-5.1.4.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/junit-3.8.1.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/kfs-0.1.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/log4j-1.2.13.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/servlet-api.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/xmlenc-0.52.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/commons-el.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/jasper-compiler.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/jasper-runtime.jar,
>> file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/jsp-api.jar],
>> parent=gnu.gcj.runtime.ExtensionClassLoader{urls=[], parent=null}}
>>         at java.net.URLClassLoader.findClass (libgcj.so.7)
>>         at java.lang.ClassLoader.loadClass (libgcj.so.7)
>>         at java.lang.ClassLoader.loadClass (libgcj.so.7)
>>         at java.lang.VMClassLoader.defineClass (libgcj.so.7)
>>         at java.lang.ClassLoader.defineClass (libgcj.so.7)
>>         at java.security.SecureClassLoader.defineClass (libgcj.so.7)
>>         at java.net.URLClassLoader.findClass (libgcj.so.7)
>>         at java.lang.ClassLoader.loadClass (libgcj.so.7)
>>         at java.lang.ClassLoader.loadClass (libgcj.so.7)
>>         at org.apache.hadoop.util.RunJar.main (RunJar.java:107)
>>
>> I suspect the issue is path related, though I am not certain. Could
>> someone please point me in the right direction?
>>
>> Much thanks,
>>
>> SM
>
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> 101tec Inc.
> Menlo Park, California, USA
> http://www.101tec.com
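One detail in the job output near the top of the thread may be worth a second
look: every Map/Reduce counter is 0, which likely just means the pattern
matched nothing. The successful run used 'dfs[a-z]+', while the quickstart
command quoted above used 'dfs[a-z.]+'; without the dot in the character
class, "dfs" must be followed directly by a letter, so dotted property names
like dfs.replication in hadoop-default.xml can never match. A minimal local
sketch of the difference, using GNU grep on a made-up sample file (the file
name and contents are hypothetical, not from the thread):

```shell
# Hypothetical sample standing in for property names in hadoop-default.xml.
printf '<name>dfs.replication</name>\n<name>dfs.name.dir</name>\n' > sample.xml

# Without the dot in the class, dotted names never match; prints nothing:
grep -oE 'dfs[a-z]+' sample.xml || true

# With the dot included, as in the quickstart pattern, both names match:
grep -oE 'dfs[a-z.]+' sample.xml
```

If the dropped dot is indeed the cause, rerunning the example with
'dfs[a-z.]+' and then inspecting the local output directory (standalone mode
writes plain files, so `cat output/*` works) should show nonzero counts.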