hadoop-common-user mailing list archives

From "Xavier Stevens" <Xavier.Stev...@fox.com>
Subject Problem retrieving entry from compressed MapFile
Date Thu, 13 Mar 2008 20:43:10 GMT
Currently I can retrieve entries if I use MapFileOutputFormat via conf.setOutputFormat with
no compression specified.  But I was trying to do this:

public void configure(JobConf jobConf) {
    // open a block-compressed MapFile writer for Text keys and values
    this.writer = new MapFile.Writer(jobConf, fileSys, dirName,
            Text.class, Text.class, SequenceFile.CompressionType.BLOCK);
}

public void map(WritableComparable key, Writable value,
                OutputCollector output, Reporter reporter) throws IOException {
    // ... records are written through this.writer ...
}

The idea is to use SequenceFile block compression. Later, in a separate class, I try to retrieve
the output values:

public static void main(String[] args) throws Exception {
    MapFile.Reader[] readers =
            MapFileOutputFormat.getReaders(fileSys, inDataPath, defaults);
    Partitioner part =
            (Partitioner) ReflectionUtils.newInstance(conf.getPartitionerClass(), conf);
    Text entryValue = (Text) MapFileOutputFormat.getEntry(
            readers, part, new Text("mykey"), new Text());
    if (entryValue != null) {
        System.out.println("My Entry's Value: " + entryValue);
    }
    for (MapFile.Reader reader : readers) {
        if (reader != null) {
            reader.close();
        }
    }
}
But when I use block compression I no longer get a result from MapFileOutputFormat.getEntry.
What am I doing wrong? And/or is there a way for this to work using
conf.setOutputFormat(MapFileOutputFormat.class) and
conf.setMapOutputCompressionType(SequenceFile.CompressionType.BLOCK)?
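In case it helps, the job-level setup I have in mind would look roughly like this. This is only a sketch against the 0.16-era mapred API; `MyJob` is a placeholder driver class, and the property name and static helper used for output compression are my assumptions:

```java
// Sketch: configuring a job to write block-compressed MapFiles.
// MyJob is a hypothetical driver class.
JobConf conf = new JobConf(MyJob.class);
conf.setOutputFormat(MapFileOutputFormat.class);

// Compress the *final* job output (the MapFiles themselves):
conf.setBoolean("mapred.output.compress", true);
SequenceFileOutputFormat.setOutputCompressionType(
        conf, SequenceFile.CompressionType.BLOCK);

// By contrast, this call affects only the intermediate map outputs
// shuffled to the reducers, not the files MapFileOutputFormat writes:
conf.setMapOutputCompressionType(SequenceFile.CompressionType.BLOCK);
```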
