hadoop-hdfs-user mailing list archives

From Michael Segel <michael_se...@hotmail.com>
Subject Re: Does libhdfs c/c++ api support read/write compressed file
Date Mon, 03 Jun 2013 14:04:36 GMT
Silly question... then what's meant by the native libraries when you talk about compression?


On Jun 3, 2013, at 5:27 AM, Harsh J <harsh@cloudera.com> wrote:

> Hi Xu,
> 
> HDFS is data agnostic. It does not currently care about what form a
> file's data is in - whether it is compressed, encrypted, serialized
> in format-x, etc.
> 
> There are hadoop-common APIs that support decompression with the
> supported codecs, but there are no C/C++ implementations of these
> (though you could call them via JNI). You will have to write or use
> your own compress/decompress code for the files.
> 
> On Mon, Jun 3, 2013 at 12:33 PM, Xu Haiti <xuhaiti@yahoo.com> wrote:
>> 
>> I found a discussion from around 2010 saying that libhdfs does not
>> support reading/writing gzip files.
>> 
>> I downloaded the newest hadoop-2.0.4 and read hdfs.h. There are still
>> no compression-related arguments.
>> 
>> Now I am wondering whether it supports reading compressed files.
>> 
>> If not, how can I patch libhdfs to make it work?
>> 
>> Thanks in advance.
>> 
>> Best Regards
>> Haiti
> 
> 
> 
> -- 
> Harsh J
> 

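For reference, below is a minimal, untested sketch of the kind of client-side
decompression Harsh describes: reading a gzip-compressed file through the
libhdfs C API and inflating it with zlib. The "default" namenode, the path
/tmp/example.gz, and the build flags are placeholders, not anything taken from
the thread.

    /*
     * Sketch: read a gzip file from HDFS via libhdfs and inflate it with zlib.
     * Build (paths are illustrative):
     *   gcc read_gz.c -o read_gz -I$HADOOP_HOME/include -L$HADOOP_HOME/lib/native -lhdfs -lz
     */
    #include <stdio.h>
    #include <string.h>
    #include <fcntl.h>
    #include <zlib.h>
    #include "hdfs.h"

    #define BUF_SIZE 65536

    int main(void) {
        /* Placeholder namenode ("default" uses the configured fs.defaultFS). */
        hdfsFS fs = hdfsConnect("default", 0);
        if (!fs) { fprintf(stderr, "hdfsConnect failed\n"); return 1; }

        /* Placeholder path to a gzip-compressed file in HDFS. */
        hdfsFile in = hdfsOpenFile(fs, "/tmp/example.gz", O_RDONLY, 0, 0, 0);
        if (!in) { fprintf(stderr, "hdfsOpenFile failed\n"); hdfsDisconnect(fs); return 1; }

        /* zlib stream configured for gzip framing (16 + MAX_WBITS). */
        z_stream strm;
        memset(&strm, 0, sizeof(strm));
        if (inflateInit2(&strm, 16 + MAX_WBITS) != Z_OK) {
            fprintf(stderr, "inflateInit2 failed\n");
            hdfsCloseFile(fs, in);
            hdfsDisconnect(fs);
            return 1;
        }

        unsigned char inbuf[BUF_SIZE], outbuf[BUF_SIZE];
        tSize nread;
        int zret = Z_OK;

        /* Read compressed bytes from HDFS, inflate them, write plain text to stdout. */
        while ((nread = hdfsRead(fs, in, inbuf, BUF_SIZE)) > 0 && zret != Z_STREAM_END) {
            strm.next_in = inbuf;
            strm.avail_in = (uInt)nread;
            do {
                strm.next_out = outbuf;
                strm.avail_out = BUF_SIZE;
                zret = inflate(&strm, Z_NO_FLUSH);
                if (zret != Z_OK && zret != Z_STREAM_END) {
                    fprintf(stderr, "inflate error: %d\n", zret);
                    goto done;
                }
                fwrite(outbuf, 1, BUF_SIZE - strm.avail_out, stdout);
            } while (strm.avail_out == 0);
        }

    done:
        inflateEnd(&strm);
        hdfsCloseFile(fs, in);
        hdfsDisconnect(fs);
        return (zret == Z_STREAM_END) ? 0 : 1;
    }

The alternative Harsh mentions - calling the hadoop-common CompressionCodec
classes through JNI - avoids duplicating codec logic in C, at the cost of
embedding a JVM in the client.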
