hadoop-user mailing list archives

From Narasingu Ramesh <ramesh.narasi...@gmail.com>
Subject Re: Can't run PI example on hadoop 0.23.1
Date Tue, 11 Sep 2012 09:04:22 GMT
Hi Vinod,
            Please check that your input and output file locations are
correct. Put your input file into HDFS first, then run the MR job; it
should work fine.
Thanks & Regards,
Ramesh.Narasingu
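A minimal sketch of that workflow, run from the Hadoop install root. The paths and the file name input.txt are assumptions for illustration; note the pi example generates its own input under a temporary QuasiMonteCarlo_TMP_* directory, so for it the key check is that your HDFS home directory exists and is writable:

```shell
# Assumed single-node setup -- adjust paths for your cluster.

# Make sure your HDFS home directory exists and is writable;
# the pi example writes its temporary input/output under it.
bin/hadoop fs -mkdir -p /user/$USER

# For jobs that read your own data, copy a local input file into HDFS first.
bin/hadoop fs -put input.txt /user/$USER/input.txt

# Then run the MR job, e.g. the pi example with 16 maps x 10000 samples each.
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.1.jar pi 16 10000
```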

On Tue, Sep 11, 2012 at 4:23 AM, Vinod Kumar Vavilapalli <
vinodkv@hortonworks.com> wrote:

>
> The AM corresponding to your MR job is failing continuously. Can you check
> the container logs for your AM ? They should be in
> ${yarn.nodemanager.log-dirs}/${application-id}/container_[0-9]*_0001_01_000001/stderr
>
> Thanks,
> +Vinod
>
> On Sep 10, 2012, at 3:19 PM, Smarty Juice wrote:
>
> Hello Champions,
>
> I am a newbie to Hadoop and am trying to run the first Hadoop example, pi,
> but I get a FileNotFoundException. I do not know why it is looking for this
> file on HDFS; can anyone please help?
>
> Thank you,
>
> >bin/hadoop jar
> share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.1.jar pi
> -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory
> -libjars
> share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-0.23.1.jar 16 10000
> 12/09/06 19:56:10 WARN conf.Configuration:
> mapred.used.genericoptionsparser is deprecated. Instead, use
> mapreduce.client.genericoptionsparser.used
> Number of Maps = 16
> Samples per Map = 10000
> Wrote input for Map #0
> Wrote input for Map #1
> Wrote input for Map #2
> Wrote input for Map #3
> Wrote input for Map #4
> Wrote input for Map #5
> Wrote input for Map #6
> Wrote input for Map #7
> Wrote input for Map #8
> Wrote input for Map #9
> Wrote input for Map #10
> Wrote input for Map #11
> Wrote input for Map #12
> Wrote input for Map #13
> Wrote input for Map #14
> Wrote input for Map #15
> Starting Job
> 12/09/06 19:56:22 WARN conf.Configuration: fs.default.name is deprecated.
> Instead, use fs.defaultFS
> 12/09/06 19:56:22 INFO input.FileInputFormat: Total input paths to process
> : 16
> 12/09/06 19:56:23 INFO mapreduce.JobSubmitter: number of splits:16
> 12/09/06 19:56:25 INFO mapred.ResourceMgrDelegate: Submitted application
> application_1346972652940_0002 to ResourceManager at /0.0.0.0:8040
> 12/09/06 19:56:27 INFO mapreduce.Job: The url to track the job:
> http://1.1.1.11:8088/proxy/application_1346972652940_0002/
> 12/09/06 19:56:27 INFO mapreduce.Job: Running job: job_1346972652940_0002
> 12/09/06 19:56:55 INFO mapreduce.Job: Job job_1346972652940_0002 running
> in uber mode : false
> 12/09/06 19:56:55 INFO mapreduce.Job: map 0% reduce 0%
> 12/09/06 19:56:56 INFO mapreduce.Job: Job job_1346972652940_0002 failed
> with state FAILED due to: Application application_1346972652940_0002 failed
> 1 times due to AM Container for appattempt_1346972652940_0002_000001 exited
> with exitCode: 1 due to:
> .Failing this attempt.. Failing the application.
> 12/09/06 19:56:56 INFO mapreduce.Job: Counters: 0
> Job Finished in 35.048 seconds
> java.io.FileNotFoundException: File does not exist:
> hdfs://localhost:9000/user/XXXXX/QuasiMonteCarlo_TMP_3_141592654/out/reduce-out
> at
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:729)
> at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1685)
> at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1709)
> at
> org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
> at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:351)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:69)
> at
> org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:360)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:144)
> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:68)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:200)
>
>
>
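Vinod's log-check suggestion can be scripted. A minimal sketch, assuming a hypothetical local log root of /tmp/logs (the real location is whatever yarn.nodemanager.log-dirs is set to in your yarn-site.xml):

```shell
# Hypothetical log root -- substitute your yarn.nodemanager.log-dirs value.
LOG_ROOT=/tmp/logs
APP_ID=application_1346972652940_0002

# The AM runs in the first container of the first attempt (..._01_000001);
# its stderr usually shows why the container exited with exitCode 1.
cat "$LOG_ROOT/$APP_ID"/container_*_01_000001/stderr
```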
