hadoop-mapreduce-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Types and SequenceFiles
Date Fri, 26 Jul 2013 02:15:15 GMT
There is a solution discussed in the mail you've quoted. Did you try
that? Could you share your code snippets?
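Looking at the snippet quoted below, one likely cause: the Mapper class declares BytesWritable values, but map() takes a Text value, so it is a new overload rather than an override. Hadoop then runs Mapper's built-in identity map, which emits the LongWritable input key unchanged and produces exactly that "expected Text, received LongWritable" error. A minimal plain-Java sketch of the overload-vs-override pitfall (FakeMapper here is a hypothetical stand-in for org.apache.hadoop.mapreduce.Mapper, not the real class):

```java
// FakeMapper is a hypothetical stand-in for Hadoop's Mapper: its map()
// defaults to an identity mapping that passes the input key through,
// just as org.apache.hadoop.mapreduce.Mapper.map does.
class FakeMapper<KEYIN, VALIN> {
    protected String map(KEYIN key, VALIN value) {
        return "identity:" + key; // default: emit the input key unchanged
    }
}

// Declares byte[] values (think BytesWritable) but gives map() a String
// value parameter (think Text) -- an overload, not an override, so the
// framework never calls it.
class BrokenMapper extends FakeMapper<Long, byte[]> {
    protected String map(Long key, String value) {
        return "custom:" + value;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        FakeMapper<Long, byte[]> m = new BrokenMapper();
        // Dispatching with the declared types (Long, byte[]) resolves to
        // the inherited identity map, so the key leaks through:
        System.out.println(m.map(42L, new byte[0])); // prints "identity:42"
    }
}
```

In the real job the fix is to make the signatures agree, e.g. map(LongWritable key, BytesWritable value, Context context) for a Mapper<LongWritable, BytesWritable, Text, NullWritable> (using whichever value class the SequenceFile was actually written with), and to annotate map() with @Override so the compiler catches any mismatch.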

On Fri, Jul 26, 2013 at 5:07 AM, qwerty <ninadphalak@gmail.com> wrote:
> Harsh J <harsh@...> writes:
>
>>
>> Ah, sorry I didn't read the exact problem.
>>
>> Yes, that static call you make to addInputPath resolves (through
>> inheritance) to FileInputFormat.addInputPath, which only adds input
>> paths; it does not also register the calling class as the job's
>> input format.
>>
>> On Fri, May 31, 2013 at 9:35 PM, Jens Scheidtmann
>> <jens.scheidtmann@...> wrote:
>> > Dear Harsh,
>> >
>> >
>> > thanks for your answer. Your post talks about the intermediate and final
>> > result types.
>> > These are already configured in my job as:
>> >         job.setOutputKeyClass(IntWritable.class);
>> >         job.setOutputValueClass(IntWritable.class);
>> >
>> > My problem was input key and value types, though.
>> > Your post let me look in the right direction. I added
>> >
>> >         job.setInputFormatClass(SequenceFileInputFormat.class);
>> >
>> > which did the trick. I thought this would be done by the
>> >
>> >      SequenceFileAsBinaryInputFormat.addInputPath(jobConf, new
>> > Path(args[i]));
>> >
>> > Best regards,
>> >
>> > Jens
>>
>
> Hey, I have a similar problem. I am trying to read a sequence
> file (compressed with Snappy).
>
> I did my Mapper class as:
>  public static class Map extends  Mapper<LongWritable, BytesWritable, Text,
> NullWritable>
>
> My map function as: public void map(LongWritable key, Text value, Context
> context)
>
> When I try to run it says 'Type mismatch in key from map: expected
> org.apache.hadoop.io.Text, received org.apache.hadoop.io.LongWritable'
>
> Can anyone please help?



-- 
Harsh J
