hadoop-hdfs-user mailing list archives

From Joey Echeverria <j...@cloudera.com>
Subject Re: Input path does not exist ERROR 2118:
Date Tue, 20 Sep 2011 16:33:45 GMT
What is the output of the following:

hadoop fs -ls hdfs://10.0.0.61/user/kiranprasad.g/pig-0.8.1/tutorial/data/excite-small.log
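If that listing comes back empty or errors out, the tutorial data probably never made it into HDFS. A rough sketch of how to check the parent directory and copy the file up, assuming the log file still sits in the local pig-0.8.1/tutorial/data directory (the local path here is an assumption):

```shell
# List the parent directory to see what is actually in HDFS
# (the path comes from the error message in the log below).
hadoop fs -ls hdfs://10.0.0.61/user/kiranprasad.g/pig-0.8.1/tutorial/data/

# If the file is missing, create the directory and copy the local
# tutorial data into HDFS. On Hadoop 0.20.x, -mkdir creates parent
# directories automatically (no -p flag needed).
hadoop fs -mkdir /user/kiranprasad.g/pig-0.8.1/tutorial/data
hadoop fs -put ~/pig-0.8.1/tutorial/data/excite-small.log \
    /user/kiranprasad.g/pig-0.8.1/tutorial/data/
```

Note also that a LOAD path without a leading scheme or slash is resolved relative to the user's HDFS home directory, so it is worth confirming that the path in the script and the path in HDFS actually line up.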

-Joey

On Tue, Sep 20, 2011 at 1:44 AM, kiranprasad
<kiranprasad.g@imimobile.com> wrote:
> Hi
>
> When I run the same script in local mode it works fine and I get the
> result, but when I try to read the file from the Hadoop file system I
> get the exception below.
>
> grunt> A= LOAD '/data/excite-small.log' AS (user, time, query);
> grunt> lmt= LIMIT A 3;
> grunt> DUMP lmt;
> 2011-09-20 15:44:30,417 [main] INFO
> org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script:
> LIMIT
> 2011-09-20 15:44:30,418 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine -
> pig.usenewlogicalplan is set to true. New logical plan will be used.
> 2011-09-20 15:44:30,456 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - (Name: lmt:
> Store(hdfs://10.0.0.61/tmp/temp-1789584520/tmp-2069895971:org.apache.pig.impl.io.InterStorage)
> - scope-11 Operator Key: scope-11)
> 2011-09-20 15:44:30,457 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler -
> File concatenation threshold: 100 optimistic? false
> 2011-09-20 15:44:30,465 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer
> - MR plan size before optimization: 1
> 2011-09-20 15:44:30,465 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer
> - MR plan size after optimization: 1
> 2011-09-20 15:44:30,473 [main] INFO
> org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to
> the job
> 2011-09-20 15:44:30,474 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
> 2011-09-20 15:44:32,453 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - Setting up single store job
> 2011-09-20 15:44:32,491 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 1 map-reduce job(s) waiting for submission.
> 2011-09-20 15:44:33,118 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 0% complete
> 2011-09-20 15:44:33,119 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - job null has failed! Stop running all dependent jobs
> 2011-09-20 15:44:33,121 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 100% complete
> 2011-09-20 15:44:33,125 [main] ERROR org.apache.pig.tools.pigstats.PigStats
> - ERROR 2997: Unable to recreate exception from backend error:
> org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path
> does not exist:
> hdfs://10.0.0.61/user/kiranprasad.g/pig-0.8.1/tutorial/data/excite-small.log
> 2011-09-20 15:44:33,126 [main] ERROR
> org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
> 2011-09-20 15:44:33,127 [main] INFO  org.apache.pig.tools.pigstats.PigStats
> - Script Statistics:
>
> HadoopVersion  PigVersion  UserId         StartedAt            FinishedAt           Features
> 0.20.2         0.8.1       kiranprasad.g  2011-09-20 15:44:30  2011-09-20 15:44:33  LIMIT
>
> Failed!
>
> Failed Jobs:
> JobId   Alias   Feature Message Outputs
> N/A     A,lmt           Message:
> org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path
> does not exist:
> hdfs://10.0.0.61/user/kiranprasad.g/pig-0.8.1/tutorial/data/excite-small.log
>         at
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:280)
>         at
> org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:885)
>         at
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:730)
>         at org.apache.hadoop.mapred.jobcontrol.Job.submit(Job.java:378)
>         at
> org.apache.hadoop.mapred.jobcontrol.JobControl.startReadyJobs(JobControl.java:247)
>         at
> org.apache.hadoop.mapred.jobcontrol.JobControl.run(JobControl.java:279)
>         at java.lang.Thread.run(Thread.java:619)
> Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException:
> Input path does not exist:
> hdfs://10.0.0.61/user/kiranprasad.g/pig-0.8.1/tutorial/data/excite-small.log
>         at
> org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:224)
>         at
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigTextInputFormat.listStatus(PigTextInputFormat.java:36)
>         at
> org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:241)
>         at
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:268)
>         ... 7 more
>         hdfs://10.0.0.61/tmp/temp-1789584520/tmp-2069895971,
>
> Input(s):
> Failed to read data from
> "hdfs://10.0.0.61/user/kiranprasad.g/pig-0.8.1/tutorial/data/excite-small.log"
>
> Output(s):
> Failed to produce result in
> "hdfs://10.0.0.61/tmp/temp-1789584520/tmp-2069895971"
>
> Counters:
> Total records written : 0
> Total bytes written : 0
> Spillable Memory Manager spill count : 0
> Total bags proactively spilled: 0
> Total records proactively spilled: 0
>
> Job DAG:
> null
>
>
> 2011-09-20 15:44:33,129 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - Failed!
> 2011-09-20 15:44:33,132 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 1066: Unable to open iterator for alias lmt
> Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1316511574458.log
>
>
> Regards
> Kiran.G



-- 
Joseph Echeverria
Cloudera, Inc.
443.305.9434
