hadoop-common-user mailing list archives

From Prakhar Sharma <prakhar.sha...@gmail.com>
Subject Re: writing files to HDFS (from c++/pipes)
Date Wed, 09 Dec 2009 01:53:45 GMT
Hi Owen,
"It also provides the entire job configuration as a string->string map."
Can you give an example of how to do this? I am trying to write a DNA
sequence assembler using Hadoop MapReduce to improve the assembler's
throughput. I need to call runTask() with different settings on
different invocations, and it is not clear to me how to do so.
(The reason for my earlier comments is that I had a hard time making
the Pipes API and libhdfs work, and I am still unclear about some use
cases.)
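For context, here is a minimal sketch of how the job configuration shows up on the C++ side of Pipes, assuming the standard Pipes headers (Pipes.hh, TemplateFactory.hh). The class names and the property name assembler.kmer.length are made up for illustration; per-job settings arrive through the configuration map rather than through separate runTask() calls:

```cpp
#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

// Hypothetical mapper, for illustration only.
class AssemblerMapper : public HadoopPipes::Mapper {
public:
  AssemblerMapper(HadoopPipes::TaskContext& context) {
    // The JobConf exposes the whole job configuration (the job's XML
    // plus any -D key=value overrides) as a string->string map.
    const HadoopPipes::JobConf* conf = context.getJobConf();
    kmerLength = 21;  // arbitrary default
    if (conf->hasKey("assembler.kmer.length")) {
      kmerLength = conf->getInt("assembler.kmer.length");
    }
  }
  void map(HadoopPipes::MapContext& context) {
    // ... use kmerLength while processing context.getInputValue() ...
  }
private:
  int kmerLength;
};

// Hypothetical reducer, for illustration only.
class AssemblerReducer : public HadoopPipes::Reducer {
public:
  AssemblerReducer(HadoopPipes::TaskContext&) {}
  void reduce(HadoopPipes::ReduceContext& context) { /* ... */ }
};

int main() {
  // runTask runs once per task attempt; vary behavior via the job
  // configuration, not by calling runTask repeatedly.
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<AssemblerMapper, AssemblerReducer>());
}
```

Settings would then be supplied at submit time, e.g. `hadoop pipes -D assembler.kmer.length=25 ...`.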


On Mon, Dec 7, 2009 at 6:30 PM, Owen O'Malley <omalley@apache.org> wrote:
> On Dec 7, 2009, at 10:05 AM, horson wrote:
>> I want to write a file to HDFS using Hadoop Pipes. Can anyone tell me how
>> to do that?
> You either use a Java OutputFormat, which is the easiest, or you use libhdfs
> to write to HDFS from C++.
>> I looked at the Hadoop Pipes source and it looked very restricted. Can I
>> do everything in Hadoop Pipes that is possible in Java?
> No, not everything is supported. It does support record readers, mappers,
> combiners, reducers, record writers, and counters from C++. It also provides
> the entire job configuration as a string->string map.
> -- Owen
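To illustrate the libhdfs route Owen mentions, here is a minimal sketch of writing a file to HDFS from C/C++ using the libhdfs API (hdfs.h); the output path is a made-up example, and "default" makes the client pick up the namenode from the Hadoop configuration:

```c
#include <fcntl.h>
#include <string.h>
#include "hdfs.h"

int main() {
  /* Connect to the namenode named in the cluster configuration. */
  hdfsFS fs = hdfsConnect("default", 0);
  if (!fs) return 1;

  /* Hypothetical output path, for illustration only. */
  const char* path = "/tmp/assembly/contigs.txt";
  hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
  if (!out) return 1;

  const char* buf = "ACGTACGT\n";
  tSize written = hdfsWrite(fs, out, (void*)buf, strlen(buf));

  hdfsFlush(fs, out);
  hdfsCloseFile(fs, out);
  hdfsDisconnect(fs);
  return written < 0;
}
```

This needs to be compiled against libhdfs and run with a JVM available, since libhdfs calls into the Java HDFS client via JNI.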
