hadoop-mapreduce-user mailing list archives

From <Shreya....@cognizant.com>
Subject RE: Issues starting TaskTracker
Date Wed, 14 Sep 2011 04:03:47 GMT

Hi

 

I downloaded the Cloudera VM
(https://ccp.cloudera.com/display/SUPPORT/Cloudera's+Hadoop+Demo+VM#Cloudera%27sHadoopDemoVM-DemoVMWareImage)
for VMware and VMware Player.

The VM is 64-bit but my OS is 32-bit.

What can be the solution?

 

 

Regards,

Shreya

 

From: Bejoy KS [mailto:bejoy.hadoop@gmail.com] 
Sent: Tuesday, September 13, 2011 3:08 PM
To: mapreduce-user@hadoop.apache.org
Subject: Re: Issues starting TaskTracker

 

Shreya
       To add on: from the Cloudera website you would get images for
different VMs like VMware, VirtualBox etc. Choose the appropriate one
for the virtualization software you have available.
      To your question, it is definitely possible to run MapReduce
programs from the Cloudera VM, and in fact it is the most comfortable
way (at least for me) to test my MapReduce code. When you are on the
Cloudera VM to test your plain MapReduce code, you don't even need to
pack your source code into a jar, deploy it and then execute it (you
might be doing development on Windows and deployment and testing on
Linux). To test your code, just follow this sequence of steps:

*	Download and install Eclipse on the VM (or whichever IDE you use)
*	Create your project with Mapper, Reducer and Driver classes (a
single file would also be fine, as per your convenience)
*	Right-click the class that contains your main method and choose
Run As > Java Application
*	It'd do the job for you.

A few things you need to keep in mind:

*	Use very minimal test data. Larger data volumes would lead to
very slow execution due to the VM's limited resources (just use the VM
to test the logic).
*	Normally in our driver class we get the input and output
directories from the command line when we deploy as a jar and run it,
but when you run from Eclipse, just alter the lines of code specifying
input and output as follows:

       For input and output directories in HDFS:
            FileInputFormat.addInputPath(job, new Path("hdfs://localhost/<full path in hdfs>"));
            FileOutputFormat.setOutputPath(job, new Path("hdfs://localhost/<full path in hdfs>"));

       For input and output directories in the local file system (lfs):
            FileInputFormat.addInputPath(job, new Path("<full path in lfs>"));
            FileOutputFormat.setOutputPath(job, new Path("<full path in lfs>"));
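Putting the pieces together, a minimal driver along these lines can be
launched straight from Eclipse with Run As > Java Application. This is
only a sketch against the classic org.apache.hadoop.mapreduce API of the
0.20 era; the class names (WordCountDriver, WordCountMapper,
WordCountReducer) and the hard-coded HDFS paths are made-up placeholders
you would replace with your own.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical driver: assumes WordCountMapper and WordCountReducer
// already exist in the same Eclipse project.
public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "word count test");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(WordCountMapper.class);
        job.setReducerClass(WordCountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // Hard-coded paths instead of args[0]/args[1], so the job can be
        // started from the IDE without packing a jar; adjust to your
        // own HDFS layout.
        FileInputFormat.addInputPath(job,
            new Path("hdfs://localhost/user/cloudera/input"));
        FileOutputFormat.setOutputPath(job,
            new Path("hdfs://localhost/user/cloudera/output"));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Note the output directory must not already exist in HDFS, or the job
will fail on startup; delete it between runs.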

Hope it helps

Regards
Bejoy.K.S



On Tue, Sep 13, 2011 at 2:40 PM, Bejoy KS <bejoy.hadoop@gmail.com>
wrote:

Hi Shreya
         You can copy files from Windows to the Linux on the VM using any
ftp tool like FileZilla.
Open a terminal on your Linux, type ifconfig; the value given under
'inet addr:' would be your IP address.
Use this IP address and the default SSH port (22) to connect to the
Linux image from Windows through FileZilla. The Cloudera VM has both
the user name and password set to 'cloudera'.
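The same transfer works from any SSH-capable command line; a sketch,
assuming an OpenSSH client is available on the Windows side and that
ifconfig reported 192.168.1.5 (an example value, not a real address):

```shell
# Inside the VM: find its IP address under 'inet addr:'.
ifconfig

# From Windows: copy a local file into the cloudera user's home
# directory over SSH (port 22 is the default, shown for clarity).
scp -P 22 mydata.txt cloudera@192.168.1.5:/home/cloudera/

# Back inside the VM: push the file into HDFS for your MapReduce job.
hadoop fs -put /home/cloudera/mydata.txt /user/cloudera/input/
```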

Hope It helps

Regards
Bejoy.KS

 

On Tue, Sep 13, 2011 at 2:18 PM, <Shreya.Pal@cognizant.com> wrote:


Hi Harsh,

Version of Hadoop - hadoop-0.20.203.0
How do I make the process owner the same as the directory owner?
The directory owner is Titun.


Regards
Shreya

-----Original Message-----
From: Harsh J [mailto:harsh@cloudera.com]
Sent: Monday, September 12, 2011 10:50 PM
To: mapreduce-user@hadoop.apache.org
Subject: Re: Issues starting TaskTracker

Shreya,

> I was getting the message owner SYSTEM when I was using default
> I was getting the message - running as TITUN, but the same error

What user are you actually launching the TaskTracker as? The directory
owner (user) must be == process owner (user) of the TT, and things
should be fine! Can you confirm that this isn't the case? What version
of Hadoop are you using?
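To make the two match, something along these lines works; a sketch
only, where /opt/hadoop/mapred/local stands in for whatever your
mapred.local.dir actually points at, and Titun is the user from the
messages above:

```shell
# Check who currently owns the TaskTracker's local directory.
ls -ld /opt/hadoop/mapred/local

# Option 1: hand the directory to the user who starts the TaskTracker.
sudo chown -R Titun /opt/hadoop/mapred/local

# Option 2: start the TaskTracker as the directory's owner instead.
sudo -u Titun bin/hadoop-daemon.sh start tasktracker
```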

P.s. Am really beginning to dislike MS Exchange or your sysadmin's
mailer settings here :-)

On Mon, Sep 12, 2011 at 9:23 PM,  <Shreya.Pal@cognizant.com> wrote:
> This e-mail and any files transmitted with it are for the sole use of
the intended recipient(s) and may contain confidential and privileged
information.
> If you are not the intended recipient, please contact the sender by
reply e-mail and destroy all copies of the original message.
> Any unauthorised review, use, disclosure, dissemination, forwarding,
printing or copying of this email or any action taken in reliance on
this e-mail is strictly
> prohibited and may be unlawful.



--
Harsh J
