Subject: Re: Small cluster setup.
From: Cyril Bogus <cyrilbogus@gmail.com>
To: user@hadoop.apache.org
Date: Thu, 14 Mar 2013 12:31:03 -0400

Thank you, that solved the problem.
But I don't understand: I have set up the env variables in bash for hadoop
home, so why is it looking for the old location?

Cyril

On Thu, Mar 14, 2013 at 9:53 AM, hadoop hive wrote:
> check your hadoop-env.sh file, set hadoop_home path correctly.
>
> On Wed, Mar 13, 2013 at 7:51 PM, Cyril Bogus wrote:
>
>> Hi Nitin,
>>
>> As part of my configuration I have set all the environment variables AND
>> added HADOOP_PREFIX. But the problem still persists, so I will just keep
>> the extra copy in order to move forward.
>>
>> And I start it from /home/hadoop. I have no idea why it is looking for
>> /home/agnik/hadoop
>>
>> On Wed, Mar 13, 2013 at 10:03 AM, Nitin Pawar wrote:
>>
>>> or set the variable HADOOP_PREFIX to the directory where hadoop is
>>> installed.
>>>
>>> On Wed, Mar 13, 2013 at 7:32 PM, Nitin Pawar wrote:
>>>
>>>> Cyril,
>>>>
>>>> how did you install hadoop?
>>>> when you start hadoop, do you start it from the location where it is
>>>> installed or from the user's home directory?
>>>>
>>>> try setting HADOOP_HOME (it's deprecated, but it helps to resolve
>>>> issues like where the config files are located, etc.)
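A minimal sketch of the setting being suggested here, in shell. The paths
assume the /home/hadoop install used throughout this thread; the JAVA_HOME
value is purely illustrative, so adjust both to your layout:

    # /home/hadoop/conf/hadoop-env.sh -- sourced by the start scripts on
    # every node, so it must point at the right install on master and slaves.
    export HADOOP_HOME=/home/hadoop      # deprecated in 1.0.x, still honored
    export HADOOP_PREFIX=/home/hadoop    # the non-deprecated equivalent
    export JAVA_HOME=/usr/lib/jvm/default-java   # illustrative; set to your JDK

One plausible answer to the question at the top of this thread: a value
exported only in ~/.bashrc on the master is not necessarily seen by the
daemons that start-all.sh launches over ssh on the slaves, so a stale path
in a slave's own hadoop-env.sh can keep winning.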
>>>> On Wed, Mar 13, 2013 at 7:29 PM, Cyril Bogus wrote:
>>>>
>>>>> Thanks for the reply.
>>>>>
>>>>> I am running on Linux. The problem IS the config file, but since I
>>>>> couldn't figure out where, I made two copies of hadoop: one where it
>>>>> is looking for it, and one where the other nodes will be looking
>>>>> for it.
>>>>>
>>>>> In my config file I have set hadoop in /home/hadoop, but it is looking
>>>>> for it in /home/owner/hadoop (for some processes at least), and that
>>>>> was the problem.
>>>>>
>>>>> On Tue, Mar 12, 2013 at 11:06 AM, Jean-Marc Spaggiari <
>>>>> jean-marc@spaggiari.org> wrote:
>>>>>
>>>>>> Hi Cyril,
>>>>>>
>>>>>> Are you running in Cygwin? Or Linux? Also, you might want to share
>>>>>> your configuration files if you want someone to take a look.
>>>>>>
>>>>>> JM
>>>>>>
>>>>>> 2013/3/8 Cyril Bogus <cyrilbogus@gmail.com>:
>>>>>> > I am trying to have a two node cluster.
>>>>>> > Hadoop 1.0.4
>>>>>> >
>>>>>> > the master is under the account A
>>>>>> > the slave is under the account B
>>>>>> >
>>>>>> > I run into an exception when I try to start the cluster.
>>>>>> > Here is my output
>>>>>> >
>>>>>> > A@owner-5:~$ /home/hadoop/bin/start-all.sh
>>>>>> > Warning: $HADOOP_HOME is deprecated.
>>>>>> >
>>>>>> > namenode running as process 3020. Stop it first.
>>>>>> > A@master: starting datanode, logging to
>>>>>> > /home/hadoop/libexec/../logs/hadoop-A-datanode-owner-5.out
>>>>>> > B@slave: Warning: $HADOOP_HOME is deprecated.
>>>>>> > B@slave:
>>>>>> > B@slave: starting datanode, logging to
>>>>>> > /home/owner/hadoop//logs/hadoop-owner-datanode-owner-7.out
>>>>>> > B@slave: Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>>> > org/apache/hadoop/util/PlatformName
>>>>>> > B@slave: Caused by: java.lang.ClassNotFoundException:
>>>>>> > org.apache.hadoop.util.PlatformName
>>>>>> > B@slave:     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>> > B@slave:     at java.security.AccessController.doPrivileged(Native Method)
>>>>>> > B@slave:     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>> > B@slave:     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>> > B@slave:     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>> > B@slave:     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> > B@slave: Could not find the main class: org.apache.hadoop.util.PlatformName.
>>>>>> > Program will exit.
>>>>>> > B@slave: Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>>> > org/apache/hadoop/hdfs/server/datanode/DataNode
>>>>>> > A@master: secondarynamenode running as process 3352. Stop it first.
>>>>>> > starting jobtracker, logging to
>>>>>> > /home/hadoop/libexec/../logs/hadoop-A-jobtracker-owner-5.out
>>>>>> > B@slave: Warning: $HADOOP_HOME is deprecated.
>>>>>> > B@slave:
>>>>>> > B@slave: starting tasktracker, logging to
>>>>>> > /home/owner/hadoop//logs/hadoop-owner-tasktracker-owner-7.out
>>>>>> > A@master: starting tasktracker, logging to
>>>>>> > /home/hadoop/libexec/../logs/hadoop-A-tasktracker-owner-5.out
>>>>>> > B@slave: Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>>> > org/apache/hadoop/util/PlatformName
>>>>>> > B@slave: Caused by: java.lang.ClassNotFoundException:
>>>>>> > org.apache.hadoop.util.PlatformName
>>>>>> > B@slave:     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>> > B@slave:     at java.security.AccessController.doPrivileged(Native Method)
>>>>>> > B@slave:     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>> > B@slave:     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>> > B@slave:     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>> > B@slave:     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> > B@slave: Could not find the main class: org.apache.hadoop.util.PlatformName.
>>>>>> > Program will exit.
>>>>>> > B@slave: Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>>> > org/apache/hadoop/mapred/TaskTracker
>>>>>> > A@owner-5:~$
>>>>
>>>> --
>>>> Nitin Pawar
>>>
>>> --
>>> Nitin Pawar
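The slave-side NoClassDefFoundError above is consistent with the slave
resolving Hadoop under /home/owner/hadoop (note the log paths) while the
jars live elsewhere, so hadoop-core never makes it onto the daemon
classpath. A quick way to check, sketched with the account and host names
from this thread (the jar glob assumes a stock 1.0.x layout):

    # From the master: see what the slave actually resolves over ssh.
    ssh B@slave 'echo HADOOP_HOME=$HADOOP_HOME; echo HADOOP_PREFIX=$HADOOP_PREFIX'

    # In a 1.0.x tarball the core jar sits at the top of the install tree;
    # if this glob matches nothing, the daemon classpath is missing it.
    ssh B@slave 'ls /home/hadoop/hadoop-core-*.jar'

If the first command prints the wrong directory, fixing conf/hadoop-env.sh
on the slave itself, not just the master's shell profile, is the usual
remedy.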