hadoop-common-user mailing list archives

From Arijit Mukherjee <ariji...@gmail.com>
Subject Cannot execute the start-mapred script
Date Mon, 16 Feb 2009 09:27:11 GMT
Hi All

I'm trying to set up a tiny 2-node cluster (both nodes on Linux FC7) with Hadoop
0.19.0. Previously, I was able to install and run Hadoop on a single node. Now
I'm trying it on 2 nodes: my idea is to put the namenode and the jobtracker on
separate nodes and, initially, use both of these nodes as datanodes as well. So
the "masters" and "slaves" files both list the names of these two nodes. When I
start the DFS from the namenode, it seems to work. But when I run the
start-mapred.sh script, I get the following exception -

blueberry: Exception in thread "main" java.lang.NoClassDefFoundError:
blueberry: Caused by: java.lang.ClassNotFoundException:
blueberry:     at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
blueberry:     at java.security.AccessController.doPrivileged(Native Method)
blueberry:     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
blueberry:     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
blueberry:     at
blueberry:     at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
blueberry:     at
blueberry: Could not find the main class:
Could_not_reserve_enough_space_for_the_card_marking_array.  Program will
exit.
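For reference, the two config files on the master look roughly like this
("blueberry" is the node from the log above; the second hostname is
illustrative, standing in for whatever the other node is called):

```shell
# conf/masters - host(s) for the secondary namenode
blueberry

# conf/slaves - hosts that run a datanode and a tasktracker;
# both nodes are listed here so both act as workers initially
blueberry
strawberry
```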

Is it related to the heap size I allocated in hadoop-env.sh? Or is it
something else?
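For context, this is the setting in question - a sketch assuming the stock
conf/hadoop-env.sh layout in 0.19.0, with an illustrative value:

```shell
# conf/hadoop-env.sh
# HADOOP_HEAPSIZE is the maximum heap in MB passed to each daemon as -Xmx.
# The "could not reserve enough space for the card marking array" message
# comes from the JVM failing to reserve that much memory at startup, so the
# value here needs to fit within the free RAM on every node.
export HADOOP_HEAPSIZE=512   # default is 1000; 512 chosen only as an example

# A quick way to check free memory (in MB) on each node:
free -m
```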


"And when the night is cloudy,
There is still a light that shines on me,
Shine on until tomorrow, let it be."
