hadoop-common-user mailing list archives

From Gert Pfeifer <pfei...@se.inf.tu-dresden.de>
Subject Re: Re: getting null from CompressionCodecFactory.getCodec(Path file)
Date Wed, 14 Jan 2009 11:18:31 GMT
I got it. For some reason getDefaultExtension() returns ".lzo_deflate".

Is that a bug? Shouldn't it be .lzo?

In the head revision I couldn't find LzoCodec at all under
http://svn.apache.org/repos/asf/hadoop/core/trunk/src/core/org/apache/hadoop/io/compress/

There should be a class LzoCodec.java there. Was it moved somewhere else?
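
(For reference, here is a minimal stdlib-only sketch, not Hadoop code, of how the factory's reversed-suffix lookup behaves. It illustrates why the printed factory shows the reversed key "etalfed_ozl." and why a codec registered under ".lzo_deflate" never matches a file named "test.lzo". The class and method names are made up for illustration.)

```java
import java.util.Map;
import java.util.SortedMap;
import java.util.TreeMap;

// Hypothetical sketch of CompressionCodecFactory's suffix matching:
// extensions are stored reversed so a file name can be matched by prefix
// after reversing it.
public class CodecLookupSketch {
    // Map from reversed extension -> codec class name.
    private final SortedMap<String, String> codecs = new TreeMap<>();

    void register(String extension, String codecClass) {
        codecs.put(new StringBuilder(extension).reverse().toString(), codecClass);
    }

    // Returns the codec whose registered extension is a suffix of the
    // file name, or null if none matches.
    String getCodec(String fileName) {
        String reversed = new StringBuilder(fileName).reverse().toString();
        for (Map.Entry<String, String> e : codecs.entrySet()) {
            if (reversed.startsWith(e.getKey())) {
                return e.getValue();
            }
        }
        return null;
    }

    public static void main(String[] args) {
        CodecLookupSketch factory = new CodecLookupSketch();
        // Register under the extension that getDefaultExtension() reports.
        factory.register(".lzo_deflate", "org.apache.hadoop.io.compress.LzoCodec");
        // "test.lzo" does not end with ".lzo_deflate", so lookup misses.
        System.out.println(factory.getCodec("test.lzo"));
        // A file actually named with the registered extension would match.
        System.out.println(factory.getCodec("test.lzo_deflate"));
    }
}
```

With this lookup scheme, getCodec("test.lzo") returns null exactly as seen in the output below, while a file named test.lzo_deflate would resolve to the codec.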

Gert

Gert Pfeifer wrote:
> Arun C Murthy wrote:
>> On Jan 13, 2009, at 7:29 AM, Gert Pfeifer wrote:
>>
>>> Hi,
>>> I want to use an lzo file as input for a mapper. The record reader
>>> determines the codec using a CompressionCodecFactory, like this:
>>>
>>> (Hadoop version 0.19.0)
>>>
>> http://hadoop.apache.org/core/docs/r0.19.0/native_libraries.html
> 
> I should have mentioned that I have these native libs running:
> 2009-01-14 10:00:21,107 INFO org.apache.hadoop.util.NativeCodeLoader:
> Loaded the native-hadoop library
> 2009-01-14 10:00:21,111 INFO org.apache.hadoop.io.compress.LzoCodec:
> Successfully loaded & initialized native-lzo library
> 
> Is that what you tried to point out with this link?
> 
> Gert
> 
>> hth,
>> Arun
>>
>>> compressionCodecs = new CompressionCodecFactory(job);
>>> System.out.println("Using codecFactory: "+compressionCodecs.toString());
>>> final CompressionCodec codec = compressionCodecs.getCodec(file);
>>> System.out.println("Using codec: "+codec+" for file "+file.getName());
>>>
>>>
>>> The output that I get is:
>>>
>>> Using codecFactory: { etalfed_ozl.:
>>> org.apache.hadoop.io.compress.LzoCodec }
>>> Using codec: null for file test.lzo
>>>
>>> Of course, the mapper does not work without a codec. What could be the
>>> problem?
>>>
>>> Thanks,
>>> Gert
