hadoop-hdfs-user mailing list archives

From Arun C Murthy <...@hortonworks.com>
Subject Re: best way to access HDFS from c++
Date Tue, 27 Dec 2011 08:07:19 GMT
I'd recommend trying the HTTP API (webhdfs - http://hortonworks.com/webhdfs-–-http-rest-access-to-hdfs/)
from your C++ application, or FUSE-DFS (the C API - http://hadoop.apache.org/common/docs/r0.20.205.0/libhdfs.html).
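
For what it's worth, a minimal libhdfs sketch from C++ might look roughly like the following
(assuming the hdfs.h header from the libhdfs docs linked above, a namenode reachable through
the "default" configured filesystem, and the usual CLASSPATH / libjvm setup that libhdfs needs
since it goes through JNI; the /tmp/hello.txt path is just a placeholder):

#include <stdio.h>
#include <string.h>
#include <fcntl.h>   /* O_WRONLY, O_CREAT, O_RDONLY */
#include "hdfs.h"    /* libhdfs C API */

int main() {
    /* "default" picks up the configured filesystem from the Hadoop conf on the CLASSPATH */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) { fprintf(stderr, "hdfsConnect failed\n"); return 1; }

    const char* path = "/tmp/hello.txt";   /* placeholder path */

    /* write a file (0s mean default buffer size, replication and block size) */
    hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!out) { fprintf(stderr, "open for write failed\n"); return 1; }
    char msg[] = "hello from C++ via libhdfs\n";
    hdfsWrite(fs, out, msg, (tSize)strlen(msg));
    hdfsFlush(fs, out);
    hdfsCloseFile(fs, out);

    /* read it back */
    hdfsFile in = hdfsOpenFile(fs, path, O_RDONLY, 0, 0, 0);
    if (!in) { fprintf(stderr, "open for read failed\n"); return 1; }
    char buf[256];
    tSize n = hdfsRead(fs, in, buf, sizeof(buf) - 1);
    if (n >= 0) { buf[n] = '\0'; printf("%s", buf); }
    hdfsCloseFile(fs, in);

    hdfsDisconnect(fs);
    return 0;
}

You'd link it against libhdfs and the JVM library and keep the Hadoop jars on the CLASSPATH at
run time. If you go the webhdfs route instead, any HTTP client will do: a read is a GET against
http://<namenode>:50070/webhdfs/v1/<path>?op=OPEN (50070 being the namenode's default HTTP port),
which redirects you to a datanode that streams the data.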

Arun

PS: Not sure why the webhdfs docs on hadoop.apache.org aren't showing up. Maybe they weren't in
0.20.205 and are only fully formed in hadoop-1.0, which just got released. I'll wait for Matt
to update the site.

On Dec 23, 2011, at 1:47 AM, 臧冬松 wrote:

> Hi,
> In my application I have a large C++ code base, and now I want to store files in HDFS
> and use the C++ programs to read/write from/to HDFS.
> I wonder what's the best way (efficient, high-throughput) to do this.
> I know there's libhdfs, which is based on JNI, and a contrib Thrift binding (which has
> been removed from trunk now).
> But if I really care about performance when reading and writing data, are there other
> ways to do this?
> I notice there's a JIRA (https://issues.apache.org/jira/browse/HDFS-2478) about using
> protobuf as the IPC protocol. Would it be a good idea to write a client lib in C++ using
> the new protobuf IPC?
> 
> Cheers,
> Donal

