From: brien colwell <xcolwell@gmail.com>
Date: Sat, 06 Feb 2010 00:26:00 -0500
To: common-user@hadoop.apache.org
Subject: Re: Not able to compile '.java' files

To get a feel for Hadoop, I'd recommend starting with Eclipse and a single node. If you add all the Hadoop JARs to your Eclipse build path (I think there are five), Eclipse will manage the classpath for you. The following config settings set Hadoop up to use the local file system and run your MapRed job in a single JVM. That way you can also set breakpoints and step through the job.
Configuration baseConf = new Configuration();
baseConf.set("mapred.job.tracker", "local");
baseConf.set("fs.default.name", "file:///");
baseConf.set("mapred.system.dir", String.format("%s/hadoop/mapred/system", LOCAL_TEMP_DIR));
baseConf.set("mapred.local.dir", String.format("%s/hadoop/mapred/data", LOCAL_TEMP_DIR));
baseConf.set("dfs.name.dir", String.format("%s/hadoop/namespace", LOCAL_TEMP_DIR));
baseConf.set("dfs.data.dir", String.format("%s/hadoop/data", LOCAL_TEMP_DIR));

Then use this configuration when setting up a JobConf.

hope that helps,
Brien

On 2/5/2010 5:58 PM, Prateek Jindal wrote:
> Hi everyone,
>
> I am new to MapReduce. I am trying to run a very basic MapReduce
> application. I encountered the following problem. Can someone help me with
> it?
>
> 1) I have 3 files, namely MaxTemperature.java, MaxTemperatureMapper.java,
> and MaxTemperatureReducer.java. Now, I have to compile them to get the
> '.class' files which would be used by the 'hadoop' command. I tried the
> following:
>
> javac -cp .:/hadoop/lib MaxTemperatureMapper.java
>
> But it gives me the error that it doesn't recognize the packages
> 'org.apache.hadoop.io', 'org.apache.hadoop.mapred', and so on.
>
> Can someone suggest something about that?
>
> 2) Also, do we necessarily have to make the '.class' files ourselves? Or
> is it somehow possible that Hadoop will make the .class files by itself
> (from the .java source files)?
>
> Thanks,
> Prateek.
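Following up for the archive: Brien's local-mode configuration might be wired into a driver for Prateek's job roughly as below. This is a sketch only, not a tested program; the class name LocalDriver is made up here, MaxTemperatureMapper and MaxTemperatureReducer are the classes from Prateek's message (assumed to use Text keys and IntWritable values, as in the usual max-temperature example), and LOCAL_TEMP_DIR is assumed to be any writable temp directory.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class LocalDriver {

    // Assumed: any writable scratch directory works here.
    static final String LOCAL_TEMP_DIR = System.getProperty("java.io.tmpdir");

    public static void main(String[] args) throws Exception {
        Configuration baseConf = new Configuration();
        baseConf.set("mapred.job.tracker", "local");  // run map/reduce in this JVM
        baseConf.set("fs.default.name", "file:///");  // use the local file system
        baseConf.set("mapred.system.dir", String.format("%s/hadoop/mapred/system", LOCAL_TEMP_DIR));
        baseConf.set("mapred.local.dir", String.format("%s/hadoop/mapred/data", LOCAL_TEMP_DIR));
        baseConf.set("dfs.name.dir", String.format("%s/hadoop/namespace", LOCAL_TEMP_DIR));
        baseConf.set("dfs.data.dir", String.format("%s/hadoop/data", LOCAL_TEMP_DIR));

        // Hand the base configuration to the JobConf, as Brien suggests.
        JobConf job = new JobConf(baseConf, LocalDriver.class);
        job.setJobName("max temperature (local)");
        job.setMapperClass(MaxTemperatureMapper.class);
        job.setReducerClass(MaxTemperatureReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        JobClient.runJob(job);  // blocks until the local job completes
    }
}
```

With this in place you can run the driver from Eclipse's debugger directly (Run As > Java Application, with input and output paths as program arguments), since nothing leaves the local JVM.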