hadoop-hdfs-user mailing list archives

From Pavan Sudheendra <pavan0...@gmail.com>
Subject Re: Cannot write the output of the reducer to a sequence file
Date Tue, 30 Jul 2013 05:29:38 GMT
Hi,
This is the output message which I got when it failed:

WARN hdfs.DFSClient: DataStreamer Exception:
org.apache.hadoop.ipc.RemoteException:
org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease
on /sequenceOutput/_temporary/_attempt_local_0001_r_000000_0/part-r-00000
File does not exist. Holder DFSClient_NONMAPREDUCE_-79044441_1 does
not have any open files.
13/07/29 17:04:20 WARN hdfs.DFSClient: Error Recovery for block null
bad datanode[0] nodes == null
13/07/29 17:04:20 WARN hdfs.DFSClient: Could not get block locations.
Source file "/sequenceOutput/_temporary/_attempt_local_0001_r_000000_0/part-r-00000"
- Aborting...
13/07/29 17:04:20 ERROR hdfs.DFSClient: Failed to close file
/sequenceOutput/_temporary/_attempt_local_0001_r_000000_0/part-r-00000
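[Editor's note: a job set up along the lines discussed in this thread usually also declares the final (reducer) output key/value classes explicitly, which is what Harsh asks about below. The following is a minimal sketch only, reusing the class and variable names from the thread (AnalyzeMapper, AnalyzeReducerFile, tablename) and assuming the new org.apache.hadoop.mapreduce API; it is not a tested fix for the LeaseExpiredException above.]

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

Configuration conf = new Configuration();
Job job = new Job(conf, "analyze");
Scan scan = new Scan();

// Mapper side, as in the thread
TableMapReduceUtil.initTableMapperJob(
    tablename,            // input HBase table name
    scan,                 // Scan instance to control CF and attribute selection
    AnalyzeMapper.class,  // mapper
    Text.class,           // mapper output key
    IntWritable.class,    // mapper output value
    job);

// Reducer side: also declare the job's *final* output types
job.setReducerClass(AnalyzeReducerFile.class);
job.setNumReduceTasks(1);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);

// Write the reducer output as a SequenceFile
job.setOutputFormatClass(SequenceFileOutputFormat.class);
SequenceFileOutputFormat.setOutputPath(job,
    new Path("hdfs://localhost:54310/sequenceOutput"));

job.waitForCompletion(true);
```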


On Mon, Jul 29, 2013 at 9:34 PM, Harsh J <harsh@cloudera.com> wrote:
> Hi,
>
> Can you explain the problem you actually face in trying to run the
> above setup? Do you also set your reducer output types?
>
> On Mon, Jul 29, 2013 at 4:48 PM, Pavan Sudheendra <pavan0591@gmail.com> wrote:
>> I have a Map function and a Reduce function outputting key-value pairs
>> of class Text and IntWritable. This is just the gist of the Map part
>> in the main function:
>>
>> TableMapReduceUtil.initTableMapperJob(
>>   tablename,        // input HBase table name
>>   scan,             // Scan instance to control CF and attribute selection
>>   AnalyzeMapper.class,   // mapper
>>   Text.class,             // mapper output key
>>   IntWritable.class,             // mapper output value
>>   job);
>>
>> And here's my Reducer part in the main function, which writes the output to HDFS:
>>
>> job.setReducerClass(AnalyzeReducerFile.class);
>> job.setNumReduceTasks(1);
>> FileOutputFormat.setOutputPath(job, new
>> Path("hdfs://localhost:54310/output_file"));
>>
>> How do I make the reducer write to a SequenceFile instead?
>>
>> I've tried the following code, but it doesn't work:
>>
>> job.setReducerClass(AnalyzeReducerFile.class);
>> job.setNumReduceTasks(1);
>> job.setOutputFormatClass(SequenceFileOutputFormat.class);
>> SequenceFileOutputFormat.setOutputPath(job, new
>> Path("hdfs://localhost:54310/sequenceOutput"));
>>
>> Any help appreciated!
>>
>>
>>
>>
>> --
>> Regards-
>> Pavan
>
>
>
> --
> Harsh J



-- 
Regards-
Pavan
