I'm new to this area and hoping to get a couple of pointers.
I am using CentOS and have Hadoop set up via CDH 5.1 (Hadoop 2.3).
I am wondering whether there is an interface to get each HDFS block's location in terms of the local file system.
For example, I can run "hadoop fsck /tmp/test.txt -files -blocks -racks" to get each block ID and the nodes holding its replicas, such as: repl=3 [/rack/hdfs01, /rack/hdfs02, ...]
With that info, is there a way to
1) log in to hdfs01 and read the block directly at the local file system level?
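For context on what I mean: my understanding is that a DataNode stores each block as a plain file (with a companion .meta checksum file) under its dfs.datanode.data.dir. The sketch below simulates that layout in a temp directory just to illustrate what I'd expect to do on the real node; the directory names and block ID here are made up, and on an actual DataNode the path would come from dfs.datanode.data.dir in hdfs-site.xml.

```shell
# Simulated DataNode data directory (hypothetical; real path comes
# from dfs.datanode.data.dir in hdfs-site.xml, e.g. /dfs/dn on CDH).
DATADIR=$(mktemp -d)
mkdir -p "$DATADIR/current/BP-1/current/finalized/subdir0"
echo "hello" > "$DATADIR/current/BP-1/current/finalized/subdir0/blk_1073741825"

# Locate the block file by the block ID that fsck reported:
find "$DATADIR" -name 'blk_1073741825*'

# The block is an ordinary file, so it can be read with standard tools:
cat "$DATADIR/current/BP-1/current/finalized/subdir0/blk_1073741825"
```

Is reading the blk_* file directly like this safe/supported, or is there a proper API for it?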
Demai on the run