hadoop-common-user mailing list archives

From: Deepak Diwakar <ddeepa...@gmail.com>
Subject: Command line config arguments
Date: Mon, 09 Aug 2010 18:09:27 GMT
Hey friends,

I have a doubt. Suppose I want to pass a program-specific config parameter
on the command line and, after reading it, assign it to a local variable.
For example, suppose I pass a threshold value to the wordcount example so
that it emits only those words whose count crosses the threshold. I declare
a static wordcount member called "threshold" which is set once the
command-line config value is read in run().
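
Roughly, the code has this shape (a minimal sketch only; the class name,
the argument positions, and the extra printout are illustrative, not my
exact program):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountWithThreshold extends Configured implements Tool {

  // Default value; run() overwrites it from the command line.
  static int threshold = 0;

  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void setup(Context context) {
      // In standalone mode this prints the command-line value; in
      // distributed mode it prints the default, because the map task
      // runs in a different JVM than run().
      System.out.println("threshold seen by mapper = " + threshold);
    }

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class ThresholdReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      // Emit only words whose count crosses the threshold -- again the
      // static member, which is only correct in standalone mode.
      if (sum >= threshold) {
        context.write(key, new IntWritable(sum));
      }
    }
  }

  @Override
  public int run(String[] args) throws Exception {
    threshold = Integer.parseInt(args[2]);   // program-specific value from the command line
    Job job = new Job(getConf(), "wordcount with threshold");
    job.setJarByClass(WordCountWithThreshold.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setReducerClass(ThresholdReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Configuration(), new WordCountWithThreshold(), args));
  }
}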

When I read the value of the threshold in the mapper in standalone mode, it
is set correctly. But when I run the same job in distributed (DFS) mode and
look at the value of the threshold in the mapper, it is not set; in fact it
takes the default value assigned at the time of declaration.

Currently, whenever I have to do such custom, program-related config
assignments, I use a sub-program to store this information in a place
called a metastore and then let the slaves (which run the map-reduce tasks)
access it and set the values of the variables accordingly.
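
Concretely, the workaround has roughly this shape (the HDFS path, the file
format, and the class name are only placeholders standing in for my
metastore; a mapper that needs the value reads it the same way in its own
setup()):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MetastoreThresholdReducer
    extends Reducer<Text, IntWritable, Text, IntWritable> {

  // Placeholder path standing in for the entry my sub-program writes.
  private static final Path THRESHOLD_FILE = new Path("/shared/wordcount/threshold");

  private int threshold;

  @Override
  protected void setup(Context context) throws IOException {
    // Every task re-reads the value the driver-side sub-program stored,
    // so it does not matter that the task runs in its own JVM on a slave.
    FileSystem fs = FileSystem.get(context.getConfiguration());
    BufferedReader in = new BufferedReader(new InputStreamReader(fs.open(THRESHOLD_FILE)));
    try {
      threshold = Integer.parseInt(in.readLine().trim());
    } finally {
      in.close();
    }
  }

  @Override
  public void reduce(Text key, Iterable<IntWritable> values, Context context)
      throws IOException, InterruptedException {
    int sum = 0;
    for (IntWritable v : values) {
      sum += v.get();
    }
    if (sum >= threshold) {   // value fetched in setup(), not a static member
      context.write(key, new IntWritable(sum));
    }
  }
}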

Could somebody point me to another way of doing this?

I would appreciate any help.


Thanks & regards,
- Deepak Diwakar,
