Date: Sat, 1 Mar 2014 14:11:03 +0000
Subject: Problem in Submitting a Map-Reduce Job to Remote Hadoop Cluster
From: Senthil Sekar <senthil.se@gmail.com>
To: user@hadoop.apache.org

Hi,

I have a remote server (CentOS 6.3) with CDH 4.0.1 installed.

I have another Windows 7 machine from which I am trying to submit a simple WordCount map-reduce job (I have included the Hadoop 2.0.0 lib jars in my Eclipse environment).

I am getting the exception below when I try to run it from Eclipse on my Windows 7 machine:

//-------------------
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
        at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:121)
        at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:83)
        at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:76)
        at org.apache.hadoop.mapred.JobClient.init(JobClient.java:487)
        at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:466)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:879)
        at com.pss.WordCount.main(WordCount.java:79)
//---------------------

Please find the code below:

//-----------------------------------------------------
import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.*;

public class WordCount {

    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    public static class Reduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws IOException {
        Configuration config = new Configuration();
        config.set("fs.default.name", "hdfs://xyz-hostname:9000");
        config.set("mapred.job.tracker", "xyz-hostname:9001");

        JobConf conf = new JobConf(config);
        conf.setJarByClass(WordCount.class);
        //conf.setJar(jar);
        conf.setJobName("WordCount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(Map.class);
        //conf.setCombinerClass(Reduce.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}
//---------------------------------------------------------------------------------------------

Please help me to resolve this issue.

Regards,
Senthil