hadoop-common-user mailing list archives

From Vinayakumar B <vinayakuma...@huawei.com>
Subject RE: Which is hdfs?
Date Mon, 09 Dec 2013 09:31:16 GMT
Hi,

Please find the answers inline; I hope they are helpful.

Thanks and Regards,
Vinayakumar B

From: unmesha sreeveni [mailto:unmeshabiju@gmail.com]
Sent: 09 December 2013 12:52
To: User Hadoop
Subject: Which is hdfs?

Can anyone tell me the difference between the details below?

My cluster is a remote system, "sree".

If you have set fs.defaultFS to "hdfs://<namenode address>", then -copyFromLocal will copy
the file to HDFS.
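For example, the copy can be done either by relying on fs.defaultFS or by giving an explicit
hdfs:// URI. The destination path and <namenode address> below are only placeholders, not
values taken from your cluster:

            > hadoop fs -copyFromLocal /home/sree/chck /user/sree/chck
            > hadoop fs -copyFromLocal /home/sree/chck hdfs://<namenode address>/user/sree/chck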

1. I have a "chck" file in my /home/sree
   I did
            > hadoop fs -copyFromLocal /home/sree/chck
            > hadoop fs -ls

                       -rw-r--r--   1 sree     supergroup         32 2013-12-03 14:27 chck

I think you ran the above commands as the sree user. That is why -ls shows sree as the
owner of the file.
Since you did not pass a destination, by default the file is copied under the /user/sree
directory, and -ls also lists that same directory.
To confirm, run '-ls /user/sree'.
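For example (the paths below are illustrative; adjust them for your cluster):

            > hadoop fs -ls /user/sree
            > hadoop fs -ls /user/sree/chck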

                 Is the chck file now residing in HDFS?
2. After executing wordcount on my remote system, my output folder looks like this:

                   drwxr-xr-x   - hdfs      supergroup          0 2013-11-19 09:41 wcout

The WordCount job was run as the hdfs user, so -ls shows hdfs as the owner of wcout.

I am confused - which one is hdfs?

In my opinion, both files are in HDFS, just under different users' home directories.
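To see both, you could list each user's home directory. The assumption below is that the
wcout output was written under /user/hdfs; adjust the path if your job wrote it elsewhere:

            > hadoop fs -ls /user/sree
            > hadoop fs -ls /user/hdfs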

The area where chck resides, or the one where wcout does?

3. Am I able to update/append the "chck" file through an MR job?
HDFS supports append, but only from one client at a time.
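If your Hadoop release is new enough to include the -appendToFile shell command (present in
the newer 2.x releases), appending from the command line looks roughly like this; the local
file name is just an example:

            > hadoop fs -appendToFile /home/sree/extra.txt /user/sree/chck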

4.  -rw-r--r--   1 hdfs     supergroup         32 2013-12-03 14:27 myfile
    Am I able to update/append the "myfile" file through an MR job?


Basically, you can "update" the file only by recreating it, i.e. overwriting it with a new
copy. Updating the same file in place is not possible.
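A minimal sketch of replacing a file by recreating it (the paths are illustrative):

            > hadoop fs -rm /user/sree/chck
            > hadoop fs -copyFromLocal /home/sree/chck /user/sree/chck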

**I read that in-place updates are not allowed in HDFS**

--
Thanks & Regards

Unmesha Sreeveni U.B
Junior Developer


