hadoop-user mailing list archives

From Irfan Sayed <irfu.sa...@gmail.com>
Subject Re: about replication
Date Mon, 19 Aug 2013 04:15:03 GMT
please suggest

regards
irfan



On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <irfu.sayed@gmail.com> wrote:

> hey Tariq,
> i am still stuck ..
> can you please suggest
>
> regards
> irfan
>
>
>
> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>>
>>
>>
>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>
>>> the attachment got quarantined
>>> resending in txt format. please rename it to conf.rar
>>>
>>> regards
>>>
>>>
>>>
>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>>
>>>> thanks.
>>>>
>>>> if i run the jps command on the namenode:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3164 NameNode
>>>> 1892 Jps
>>>>
>>>> the same command on the datanode:
>>>>
>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3848 Jps
>>>>
>>>> jps does not list any process for the datanode. however, in the web
>>>> browser i can see one live datanode
>>>> please find the attached conf.rar file of the namenode
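>>>>
>>>> (side note: jps only lists JVMs owned by the current user, so a
>>>> datanode started under another account would not show up. a quick way
>>>> to dig further is to tail the datanode log -- the path below assumes
>>>> the default logs dir under the hadoop-1.1.2 install:)
>>>>
>>>> Administrator@DFS-1 /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>> $ tail -n 50 logs/*datanode*.log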
>>>>
>>>> regards
>>>>
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>
>>>>> OK, we'll start fresh. Could you please show me your latest config files?
>>>>>
>>>>> BTW, are your daemons running fine? Use jps to verify that.
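>>>>>
>>>>> (roughly what a healthy single-node setup would show -- the pids here
>>>>> are made up; on a two-machine setup you'd see NameNode on one box and
>>>>> DataNode on the other:)
>>>>>
>>>>> $ jps
>>>>> 3164 NameNode
>>>>> 2287 DataNode
>>>>> 1892 Jps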
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>>>>
>>>>>> i have created these dirs "wksp_data" and "wksp_name" on both the
>>>>>> datanode and the namenode
>>>>>> made the respective changes in "hdfs-site.xml" file
>>>>>> formatted the namenode
>>>>>> started the dfs
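>>>>>>
>>>>>> (for reference, that sequence from Cygwin would look roughly like
>>>>>> this, assuming the install lives under
>>>>>> /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2:)
>>>>>>
>>>>>> $ cd /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>> $ ./bin/hadoop namenode -format
>>>>>> $ ./bin/start-dfs.sh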
>>>>>>
>>>>>> but still, i am not able to browse the file system through the web
>>>>>> browser
>>>>>> please refer below
>>>>>>
>>>>>> anything still missing ?
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>>>>>
>>>>>>> these dirs need to be created on all datanodes and namenodes?
>>>>>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>>>>>> namenodes for these new dirs?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>>
>>>>>>>> Create 2 directories manually corresponding to the values of
>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>> these directories to 755. When you start pushing data into your HDFS,
>>>>>>>> data will start going inside the directory specified by dfs.data.dir
>>>>>>>> and the associated metadata will go inside dfs.name.dir. Remember, you
>>>>>>>> store data in HDFS, but it eventually gets stored in your local/native
>>>>>>>> FS. But you cannot see this data directly on your local/native FS.
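>>>>>>>>
>>>>>>>> (e.g., from Cygwin -- the directory names below are placeholders;
>>>>>>>> use whatever your hdfs-site.xml actually points at:)
>>>>>>>>
>>>>>>>> $ mkdir -p /cygdrive/c/hadoop/wksp_name /cygdrive/c/hadoop/wksp_data
>>>>>>>> $ chmod 755 /cygdrive/c/hadoop/wksp_name /cygdrive/c/hadoop/wksp_data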
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> however, i need this to be working on a windows environment as a
>>>>>>>>> project requirement.
>>>>>>>>> i will add/work on Linux later
>>>>>>>>>
>>>>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system, OR do i
>>>>>>>>> need to create it from the command line?
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hello Irfan,
>>>>>>>>>>
>>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>>>
>>>>>>>>>> The HDFS webUI doesn't provide the ability to create a file or
>>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>>> operations like create, move, copy etc. are not supported.
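>>>>>>>>>>
>>>>>>>>>> (creating a directory from the command line is a one-liner; /wksp
>>>>>>>>>> below is just an example path:)
>>>>>>>>>>
>>>>>>>>>> $ ./bin/hadoop fs -mkdir /wksp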
>>>>>>>>>>
>>>>>>>>>> These values look fine to me.
>>>>>>>>>>
>>>>>>>>>> One suggestion though: try getting a Linux machine (if possible),
>>>>>>>>>> or at least use a VM. I personally feel that using Hadoop on windows
>>>>>>>>>> is always messy.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> when i browse the file system, i am getting the following:
>>>>>>>>>>> i haven't seen any make-directory option there
>>>>>>>>>>>
>>>>>>>>>>> do i need to create it from the command line?
>>>>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>>>>> entries. are they correct?
>>>>>>>>>>>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>> </property>
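>>>>>>>>>>>
>>>>>>>>>>> (one caveat: dfs.name.dir and dfs.data.dir normally point at two
>>>>>>>>>>> different directories, so the namenode metadata and the datanode
>>>>>>>>>>> blocks don't share a path. a variant along these lines -- the
>>>>>>>>>>> directory names are just an example:)
>>>>>>>>>>>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>   <value>c:\\wksp_name</value>
>>>>>>>>>>> </property>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>   <value>c:\\wksp_data</value>
>>>>>>>>>>> </property>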
>>>>>>>>>>>
>>>>>>>>>>> please suggest
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <manishd207@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>
>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>> copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>
>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>> copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>
>>>>>>>>>>>> Because you gave local paths for both, and you do not need to
>>>>>>>>>>>> copy hadoop into hdfs... Hadoop is already working..
>>>>>>>>>>>>
>>>>>>>>>>>> Just check in the browser after starting your single node
>>>>>>>>>>>> cluster:
>>>>>>>>>>>>
>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>
>>>>>>>>>>>> then follow the "browse the filesystem" link there..
>>>>>>>>>>>>
>>>>>>>>>>>> If there is no directory then make a directory there.
>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>>>>> file. That's what hadoop is used for; first you need to make that
>>>>>>>>>>>> clear in your mind. Then and only then will you do it... otherwise
>>>>>>>>>>>> it is not possible..
>>>>>>>>>>>>
>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>
>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file /hdfs/directory/path
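>>>>>>>>>>>>
>>>>>>>>>>>> (a concrete run with a made-up file name, just to show the
>>>>>>>>>>>> shape -- sample.txt is hypothetical:)
>>>>>>>>>>>>
>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
>>>>>>>>>>>> $ ./bin/hadoop dfs -ls /wksp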
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>>>>>>> however, i need a windows setup.
>>>>>>>>>>>>>
>>>>>>>>>>>>> let me surely refer to the doc and link which you sent, but i
>>>>>>>>>>>>> need this to be working ...
>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> --
>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>> -THANX
>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
