hadoop-common-user mailing list archives

From Vinod Kumar Vavilapalli <vino...@hortonworks.com>
Subject Re: Multiple map tasks writing into same hdfs file
Date Thu, 10 Jul 2014 20:00:52 GMT
Concurrent writes to a single file in HDFS are not possible today. You may
want to have each task write its own file and use that entire directory as your output.
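A minimal sketch of what I mean (class and method names here are just placeholders
for your own code): each map task emits its 8 values as a single record, and the
framework writes that record into the task's own part-m-XXXXX file under the job
output directory, so no two tasks ever touch the same HDFS file.

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class EightParamMapper
        extends Mapper<LongWritable, Text, NullWritable, Text> {

      @Override
      protected void map(LongWritable key, Text value, Context context)
          throws IOException, InterruptedException {
        // computeParameters() is a stand-in for whatever produces your 8 values.
        String[] params = computeParameters(value.toString());

        // Join the 8 values into one CSV line.
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < params.length; i++) {
          if (i > 0) line.append(',');
          line.append(params[i]);
        }

        // The record goes to this task's own output file in the job output
        // directory; no shared file is ever opened by two tasks.
        context.write(NullWritable.get(), new Text(line.toString()));
      }

      private String[] computeParameters(String record) {
        // ... your per-record computation goes here ...
        return new String[8];
      }
    }

If you don't need a reduce phase, call job.setNumReduceTasks(0) so the map
output files land directly in the output directory. Downstream you can read
the whole directory, or merge the part files locally with
hadoop fs -getmerge <outputdir> <localfile>.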

+Vinod
Hortonworks Inc.
http://hortonworks.com/


On Wed, Jul 9, 2014 at 10:42 PM, rab ra <rabmdu@gmail.com> wrote:

>
> hello
>
>
>
> I have a use case that spans multiple map tasks in a Hadoop environment. I
> use Hadoop 1.2.1 with 6 task nodes. Each map task writes its output into a
> file stored in HDFS. This file is shared across all the map tasks. Although
> they all compute their output, some of the results are missing from the
> output file.
>
>
>
> The output file is an Excel file with 8 parameters (headings). Each map
> task is supposed to compute all 8 values and save each one as soon as it
> is computed. This means the programming logic of a map task opens the
> file, writes a value, and closes it, 8 times.
>
>
>
> Can someone give me a hint on what's going wrong here?
>
>
>
> Is it possible to make more than one map task write to a shared file in
> HDFS?
>

