hadoop-common-user mailing list archives

From jayalakshmi sandhya <saisandhya...@gmail.com>
Subject Re: prob while setting the nodes
Date Sat, 16 Jan 2010 04:08:43 GMT
Thank you. That looks like a nice one; I will read the tutorial and then
proceed with my work.

regards-
Sandhya

On Sat, Jan 16, 2010 at 12:04 AM, vikas <pvssvikas@gmail.com> wrote:

> Hi,
>
> To get started with Hadoop, you can follow the links below, where a
> working VM is shared by the Yahoo team.
>
> http://developer.yahoo.com/hadoop/tutorial/index.html
> http://developer.yahoo.com/hadoop/tutorial/module3.html
>
> Once you are comfortable with Hadoop and have gained basic confidence
> with it, you can compare that working environment with your setup to
> resolve your issue.
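>
> Also, since the error mentions a bad version number in a .class file, it
> may help to compare the JVM on your machine against the JAVA_HOME you set
> for Hadoop (a minimal sketch; the path comes from your mail below):
>
> java -version                        # the JVM on your PATH
> /home/sandhya/jdk/bin/java -version  # the JVM hadoop-env.sh points at
>
> If the two differ, the wrapper scripts and your shell may be running
> different JVMs.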
>
> Hope this helps,
>
> Thanks,
> -Vikas.
>
> On Fri, Jan 15, 2010 at 4:58 PM, jayalakshmi sandhya <
> saisandhyacse@gmail.com> wrote:
>
> > Oh, yes, sure. I'll paste the text here.
> >
> > When I type these commands in the terminal,
> >
> > sandhya@sandhya-laptop:/usr/local/hadoop$ bin/hadoop jar
> > hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
> >
> > (or)
> >
> > sandhya@sandhya-laptop:/usr/local/hadoop$ bin/hadoop namenode -format
> >
> > I get this error:
> >
> > Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad
> > version number in .class file
> >         at java.lang.ClassLoader.defineClass1(Native Method)
> >         at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
> >         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
> >         at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
> >         at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> >         at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> >
> > Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad
> > version number in .class file
> >         (same stack trace as above)
> >
> > sandhya@sandhya-laptop:/usr/local/hadoop$
> >
> >
> > Actually, there was a step after installing Hadoop:
> >
> > Unpack the downloaded Hadoop distribution. In the distribution, edit the
> > file conf/hadoop-env.sh to define at least JAVA_HOME to be the root of
> your
> > Java installation.
> >
> > So this is my conf/hadoop-env.sh:
> >
> > # Set Hadoop-specific environment variables here.
> >
> > # The only required environment variable is JAVA_HOME.  All others are
> > # optional.  When running a distributed configuration it is best to
> > # set JAVA_HOME in this file, so that it is correctly defined on
> > # remote nodes.
> >
> > # The java implementation to use.  Required.
> > export JAVA_HOME=/home/sandhya/jdk   # I CHANGED HERE .. MY JAVA INSTALLATION IS IN THIS PATH
> >
> > # Extra Java CLASSPATH elements.  Optional.
> > # export HADOOP_CLASSPATH=
> >
> > # The maximum amount of heap to use, in MB. Default is 1000.
> > export HADOOP_HEAPSIZE=2000 # I CHANGED HERE
> >
> > # Extra Java runtime options.  Empty by default.
> > export HADOOP_OPTS=-server
> >
> > # Command specific options appended to HADOOP_OPTS when specified
> > export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
> > export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
> > export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
> > export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
> > export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
> > # export HADOOP_TASKTRACKER_OPTS=
> > # The following applies to multiple commands (fs, dfs, fsck, distcp etc)
> > # export HADOOP_CLIENT_OPTS
> >
> > # Extra ssh options.  Empty by default.
> > # export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
> >
> > # Where log files are stored.  $HADOOP_HOME/logs by default.
> > # export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
> >
> > # File naming remote slave hosts.  $HADOOP_HOME/conf/slaves by default.
> > # export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
> >
> > # host:path where hadoop code should be rsync'd from.  Unset by default.
> > # export HADOOP_MASTER=master:/home/$USER/src/hadoop
> >
> > # Seconds to sleep between slave commands.  Unset by default.  This
> > # can be useful in large clusters, where, e.g., slave rsyncs can
> > # otherwise arrive faster than the master can service them.
> > # export HADOOP_SLAVE_SLEEP=0.1
> >
> > # The directory where pid files are stored. /tmp by default.
> > # export HADOOP_PID_DIR=/var/hadoop/pids
> >
> > # A string representing this instance of hadoop. $USER by default.
> > # export HADOOP_IDENT_STRING=$USER
> >
> > # The scheduling priority for daemon processes.  See 'man nice'.
> > # export HADOOP_NICENESS=10
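> >
> > (For reference, JAVA_HOME is supposed to name the directory that contains
> > bin/java and bin/javac. A quick sanity check on the line I changed, using
> > my path:
> >
> > ls /home/sandhya/jdk/bin/java /home/sandhya/jdk/bin/javac
> > /home/sandhya/jdk/bin/java -version
> > )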
> >
> > When I googled, I found this:
> >
> > That's because you're using classes compiled with different versions of
> > Java. Typically, if you use a class compiled with Java 1.5 in a Java 1.4
> > JVM, it's not going to work.
> >
> >
> > I do not know how I should verify the above statements. Actually, I had
> > another installation of Java; I have now removed that version.
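> >
> > (Perhaps one way to check would be to compare my JVM version with the
> > version the Hadoop classes were compiled for; a rough sketch, assuming
> > the examples jar contains org/apache/hadoop/examples/Grep.class:
> >
> > java -version
> > unzip -p hadoop-*-examples.jar org/apache/hadoop/examples/Grep.class \
> >     | head -c 8 | od -An -t u1
> > # the last two bytes printed are the class file major version:
> > # 48 = Java 1.4, 49 = Java 5, 50 = Java 6
> >
> > If that major version is newer than what java -version reports, it would
> > explain the UnsupportedClassVersionError. But I am not sure if this is
> > right.)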
> >
> > On Fri, Jan 15, 2010 at 3:25 PM, vikas <pvssvikas@gmail.com> wrote:
> >
> > > But the attachment is missing; it is better to paste your text in the
> > > mail itself.
> > >
> > > -Vikas.
> > >
> > >
> > >
> > > On Fri, Jan 15, 2010 at 2:32 PM, jayalakshmi sandhya <
> > > saisandhyacse@gmail.com> wrote:
> > >
> > > > Hi, I downloaded and installed Hadoop. When I was setting up the
> > > > nodes, following the instructions given in Apache Hadoop's Quickstart,
> > > > I ran into a problem and am not able to proceed further. Please help
> > > > me out.
> > > >
> > > > I have explained the problem in detail in the attached document.
> > > >
> > > > regards-
> > > > Sandhya
> > > >
> > >
> >
>
