hadoop-mapreduce-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Does libhdfs c/c++ api support read/write compressed file
Date Mon, 03 Jun 2013 10:27:07 GMT
Hi Xu,

HDFS is data agnostic. It does not currently care what form a file's
data is in: compressed, encrypted, serialized in some format-x, and
so on.

There are hadoop-common (Java) APIs for compressing and decompressing
with the supported codecs, but there are no C/C++ implementations of
these (though you could invoke them via JNI). You will have to write
or use your own decompress/compress code around libhdfs reads and
writes.

On Mon, Jun 3, 2013 at 12:33 PM, Xu Haiti <xuhaiti@yahoo.com> wrote:
> I have found a discussion from around 2010 saying that libhdfs does not
> support reading/writing gzip files.
> I downloaded the newest hadoop-2.0.4 and read hdfs.h; there are still no
> compression-related arguments.
> Now I am wondering whether it supports reading compressed files yet.
> If it does not, how can I write a patch for libhdfs to make it work?
> Thanks in advance.
> Best Regards
> Haiti

Harsh J
