hadoop-common-user mailing list archives

From Sky USC <sky...@hotmail.com>
Subject Reading properties file from command line + passing parameters from main to mapper/reducer?
Date Fri, 06 Apr 2012 22:24:42 GMT

Dear friends, I am new to Hadoop on AWS. I am using AWS Elastic MapReduce and am trying to convert
an old Java program to Elastic MapReduce. I would appreciate your help. My questions are:
1. How do I pass a "Properties" file - which I used to pass via command-line parameters - to
Elastic MapReduce?
2. How do I pass parameters from this properties file from main() to the Mapper and Reducer?
1. Properties file:
My program used to read a properties file as follows:
org.apache.commons.configuration.Configuration config = new PropertiesConfiguration("my_app.properties");
How do I read this in Amazon EMR?
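One approach (a sketch, not something I have verified on EMR: it assumes the cluster's Hadoop configuration can already resolve s3n:// URIs, i.e. the fs.s3n.* credentials are set): PropertiesConfiguration("my_app.properties") resolves against the local filesystem, so an S3 URL will not be found there. Instead, one can open the S3 path through Hadoop's FileSystem API and load the resulting stream; the class and method names below are illustrative:

```java
import java.io.InputStream;
import java.net.URI;
import java.util.Properties;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LoadRemoteProperties {
    // Load a java.util.Properties file from any Hadoop-resolvable URI
    // (s3n://, hdfs://, file://), instead of assuming a local file.
    public static Properties load(String uri, Configuration conf) throws Exception {
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = fs.open(new Path(uri));
        try {
            Properties props = new Properties();
            props.load(in);
            return props;
        } finally {
            in.close();
        }
    }
}
```

A java.util.Properties object is not a drop-in replacement for a commons-configuration PropertiesConfiguration, but for simple key=value files it covers the same ground.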
I tried launching the program with:
jar file: s3n://my.bucket.name/myjar-job.jar
args: s3n://my.bucket.name/my_app.properties
I get a stderr of:
org.apache.commons.configuration.ConfigurationException: Cannot locate configuration source s3n://my.bucket.name/my_app.properties
In order to debug, I tried the following:
String c = FileUtils.readFileToString(new File(remainingArgs[0]));
I got an exception:
Exception in thread "main" java.io.FileNotFoundException: File 's3n:/my.bucket.name/my_app.propertiess' does not exist
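At least when the path goes through java.io.File (which commons-io's FileUtils.readFileToString does), consecutive separators are collapsed by File's path normalization, independent of EMR or S3. A quick local check (plain JDK, bucket name copied from above):

```java
import java.io.File;

public class SlashCheck {
    public static void main(String[] args) {
        // java.io.File normalizes the path string on construction,
        // collapsing repeated '/' separators - so the "//" after the
        // scheme is silently reduced to a single "/".
        File f = new File("s3n://my.bucket.name/my_app.properties");
        System.out.println(f.getPath());
        // → s3n:/my.bucket.name/my_app.properties
    }
}
```

This also explains the FileNotFoundException itself: java.io.File only understands local paths, so an S3 URL handed to it is treated as a (nonexistent) local file name.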
Notice that even though I entered s3n://, the file name is shown as s3n:/ - could the // be getting dropped?

2. Passing the properties file from my main() to the Mapper and Reducer:
How do I do that? Or pass other generic parameters?
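The usual pattern (a sketch; the key name "myapp.threshold" and the class names are made up for illustration) is to copy values into the job's Configuration in main() and read them back via context.getConfiguration() on the task side - the Configuration travels with the job, so anything set before job submission is visible in every Mapper and Reducer:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;

public class ParamPassing {

    public static class MyMapper extends Mapper<LongWritable, Text, Text, Text> {
        private String threshold;

        @Override
        protected void setup(Context context) {
            // Read the value back on the task side, with a default
            // in case it was never set.
            threshold = context.getConfiguration().get("myapp.threshold", "default");
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Set values BEFORE creating the Job; changes made to conf
        // after the Job is constructed are not guaranteed to be seen.
        conf.set("myapp.threshold", "42");
        Job job = new Job(conf, "my job");  // Job.getInstance(conf) in newer APIs
        job.setJarByClass(ParamPassing.class);
        job.setMapperClass(MyMapper.class);
        // ... set input/output paths, reducer, etc., then job.waitForCompletion(true)
    }
}
```

For a whole properties file, one could iterate over its keys in main() and conf.set() each entry (perhaps under a common prefix) rather than shipping the file itself.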