hive-user mailing list archives

From Jothi A <>
Subject compressions in Hadoop
Date Fri, 27 May 2011 09:59:34 GMT
I'm working with compression in Hadoop. The idea is to compress a file on
Linux, process it in Hadoop, and then decompress the output back on Linux.
This works for gzip and bzip2, but when I tried the default compression I
could compress the file, yet decompression failed: the output file came back
in an unreadable format. Please clarify what went wrong.

To compress:   compress -c file.txt > file.txt.Z
To decompress: time uncompress test.Z   (test.Z is the output file obtained
after processing in Hadoop)
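For reference, the formats involved can be sketched as follows. This is an illustrative sketch only (the sample data is made up): Hadoop's DefaultCodec writes a zlib/DEFLATE stream, while Unix compress/uncompress use LZW `.Z` files with magic bytes 0x1f 0x9d, so the two are not interchangeable.

```python
import zlib

# Illustrative only: Hadoop's DefaultCodec emits a zlib/DEFLATE stream,
# while Unix compress/uncompress use LZW with magic bytes 0x1f 0x9d.
data = b"hello hadoop\n"

z = zlib.compress(data)   # a DefaultCodec-style zlib stream
print(z[:2].hex())        # zlib header bytes, not compress's 1f 9d

# A zlib stream round-trips with zlib, but uncompress would reject it,
# because .Z (LZW) and zlib/DEFLATE are unrelated formats.
assert zlib.decompress(z) == data
```

This is why a `.Z`-style decompress step fails on DefaultCodec output even though the gzip and bzip2 round trips work: gzip and bzip2 have matching codecs and matching command-line tools, while the default codec's output has no `.Z` counterpart.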

