hadoop-common-user mailing list archives

From: Stanley Shi <s...@pivotal.io>
Subject: Re: Local file system to access hdfs blocks
Date: Wed, 27 Aug 2014 04:14:15 GMT
I am not sure this is what you want, but you can try this shell command:

find [DATANODE_DIR] -name [blockname]
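
For example, assuming the datanode keeps its blocks under /dfs/dn (the
actual directory is whatever dfs.datanode.data.dir points to in
hdfs-site.xml; /dfs/dn is just a common CDH default) and fsck reported a
hypothetical block ID of blk_1073741825, you would run something like:

find /dfs/dn -name 'blk_1073741825*'

The wildcard matches both the block file itself (blk_1073741825) and its
checksum file (blk_1073741825_<genstamp>.meta). A fuller end-to-end
sketch follows the quoted question below.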


On Tue, Aug 26, 2014 at 6:42 AM, Demai Ni <nidmgg@gmail.com> wrote:

> Hi, folks,
>
> I am new to this area and hoping to get a couple of pointers.
>
> I am using CentOS and have Hadoop set up with CDH 5.1 (Hadoop 2.3).
>
> I am wondering whether there is an interface to get each HDFS block's
> information in terms of the local file system.
>
> For example, I can use "hadoop fsck /tmp/test.txt -files -blocks -racks"
> to get the block IDs and their replicas on the nodes, such as: repl=3
> [/rack/hdfs01, /rack/hdfs02...]
>
>  With such info, is there a way to
> 1) log in to hdfs01 and read the block directly at the local file system
> level?
>
>
> Thanks
>
> Demai on the run
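
Putting the two steps together, here is a minimal sketch of question 1)
above (the host name, block ID, and data directory are hypothetical;
substitute whatever fsck actually reports on your cluster):

# on any client: list the block IDs and the datanodes holding them
hadoop fsck /tmp/test.txt -files -blocks -locations -racks

# suppose fsck reports blk_1073741825 on hdfs01; log in to that node
# and locate the raw block file on its local file system
ssh hdfs01 "find /dfs/dn -name 'blk_1073741825*'"

Keep in mind that reading block files this way bypasses HDFS entirely
(no checksumming, no permissions), so it is really only useful for
debugging.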




-- 
Regards,
Stanley Shi
