hadoop-common-user mailing list archives

From Ganesh Iyer <ganesh.vignesw...@gmail.com>
Subject Mapreduce example: Gutenberg
Date Tue, 08 Jun 2010 01:22:25 GMT
Hi,
I am new to Hadoop.
I have executed
[hadoop@master hadoop]$ bin/start-dfs.sh
starting namenode, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-namenode-master.out
master: starting datanode, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-datanode-master.out
slave1: starting datanode, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-datanode-slave1.out
slave2: starting datanode, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-datanode-slave2.out
master: starting secondarynamenode, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-master.out

[hadoop@master hadoop]$ bin/start-mapred.sh
starting jobtracker, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-jobtracker-master.out
master: starting tasktracker, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-tasktracker-master.out
slave2: starting tasktracker, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-tasktracker-slave2.out
slave1: starting tasktracker, logging to /hadoop/hadoop/bin/../logs/hadoop-hadoop-tasktracker-slave1.out



I've created a gutenberg folder and ran all the commands from before (when I set up the single-node cluster):

cd /hadoop
mkdir gutenberg
cd gutenberg
wget http://www.gutenberg.org/files/20417/20417.txt
wget http://www.gutenberg.org/dirs/etext04/7ldvc10.txt
wget http://www.gutenberg.org/files/4300/4300.txt
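
A quick local check (nothing Hadoop-specific, just confirming the three text files ended up in /hadoop/gutenberg):

# should show 20417.txt, 7ldvc10.txt and 4300.txt with non-zero sizes
ls -l /hadoop/gutenberg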


cd /hadoop/hadoop
bin/hadoop dfs -ls
bin/hadoop dfs -copyFromLocal /hadoop/gutenberg gutenberg
bin/hadoop dfs -ls gutenberg
bin/hadoop dfs -rmr gutenberg-output
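
As far as I understand, the input should now be visible inside HDFS before the job is started (a minimal check; the file names come from the downloads above):

# list the uploaded input; expecting 20417.txt, 7ldvc10.txt and 4300.txt
bin/hadoop dfs -ls gutenberg
# peek at one file to make sure its contents were actually copied
bin/hadoop dfs -cat gutenberg/20417.txt | head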



Then when I run the following command:
bin/hadoop jar hadoop-0.20.2-examples.jar wordcount gutenberg gutenberg-output

nothing is displayed.
I've configured 2 slaves and have given the IP addresses and hostnames in both.
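
From the tutorials I have followed, the result should be checkable roughly like this once the job finishes (a sketch; the part-* output name and the port 50030 web UI are, as far as I know, the 0.20 defaults):

# did the job write anything to the output directory?
bin/hadoop dfs -ls gutenberg-output
# if so, show some of the word counts
bin/hadoop dfs -cat gutenberg-output/part-* | head
# the JobTracker web UI (http://master:50030/ by default) should also list the job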

Why doesn't the Gutenberg MapReduce job work?
Please help.

-- 
regards,
Ganesh Neelakanta Iyer,
Research Scholar,
Department of Electrical and Computer Engineering,
Computer Networks and Distributed Systems Laboratory,
National University of Singapore,
Singapore
http://cnds.ece.nus.edu.sg/ganesh/

Hare Krishna!!!
