hadoop-common-user mailing list archives

From rahul patodi <patodira...@gmail.com>
Subject Re: exceptions copying files into HDFS
Date Sun, 12 Dec 2010 18:09:35 GMT
Sanford,
I have read your previous posts; the blog URL I gave also contains the
configuration for running Hadoop in pseudo-distributed mode.
Also, the exception you are getting is because your datanode is down.
I would suggest starting from scratch.
*To be more specific*, if you need a quick install tutorial:
for hadoop:
http://hadoop-tutorial.blogspot.com/2010/11/running-hadoop-in-pseudo-distributed.html
for cloudera:
http://cloudera-tutorial.blogspot.com/2010/11/running-cloudera-in-pseudo-distributed.html
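
If you do start from scratch, the minimal sequence (assuming a 0.20-style
tarball install run from $HADOOP_HOME, and that it is OK to discard the HDFS
data under hadoop.tmp.dir, which defaults to /tmp/hadoop-<user>) is roughly:

```
$ bin/stop-all.sh
$ rm -rf /tmp/hadoop-$USER       # only if you really mean to wipe HDFS data
$ bin/hadoop namenode -format
$ bin/start-all.sh
$ jps
```

After start-all.sh, jps on a single pseudo-distributed box should list
NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker.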

On Sun, Dec 12, 2010 at 11:12 PM, Sanford Rockowitz
<rockowitz@minsoft.com>wrote:

> Rahul,
>
> I should have been more explicit.  I am simply trying to run in
> pseudo-distributed mode.   For further comments, see my previous post to
> Varadharajan.
>
> Thanks,
> Sanford
>
>
> On 12/12/2010 2:24 AM, rahul patodi wrote:
>
>> you can follow this tutorial:
>>
>>
>> http://hadoop-tutorial.blogspot.com/2010/11/running-hadoop-in-distributed-mode.html
>>
>> http://cloudera-tutorial.blogspot.com/2010/11/running-cloudera-in-distributed-mode.html
>> also, before running any job, please ensure all the required processes are
>> running on the correct nodes,
>> e.g. on the master:
>> NameNode, JobTracker, SecondaryNameNode (if you are not running the
>> secondary namenode on another system)
>>
>> on the slaves:
>> DataNode, TaskTracker
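
The per-node checklist above can be scripted as a quick sanity check. This is
only a sketch: the helper name is made up, and the daemon names match
0.20-era Hadoop releases.

```python
# Sketch: given the text output of `jps` on a node, report which of the
# daemons required for that node's role are not running.

REQUIRED = {
    "master": {"NameNode", "JobTracker", "SecondaryNameNode"},
    "slave": {"DataNode", "TaskTracker"},
}

def missing_daemons(jps_output, role):
    """Return the required daemons absent from the jps output, sorted."""
    running = set()
    for line in jps_output.splitlines():
        parts = line.split()
        if len(parts) == 2:          # lines look like "29909 DataNode"
            running.add(parts[1])
    return sorted(REQUIRED[role] - running)

# Example with the output posted earlier in this thread:
sample = """31177 Jps
29909 DataNode
29751 NameNode
30052 SecondaryNameNode"""
print(missing_daemons(sample, "master"))   # ['JobTracker']
```

Note that run against the jps output quoted below, this reports JobTracker as
missing on the master, which is worth checking before retrying the copy.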
>>
>>
>> On Sun, Dec 12, 2010 at 2:46 PM, Varadharajan Mukundan<
>> srinathsmn@gmail.com>  wrote:
>>
>>> Hi,
>>>
>>>> jps reports DataNode, NameNode, and SecondaryNameNode as running:
>>>>
>>>> rock@ritter:/tmp/hadoop-rock>  jps
>>>> 31177 Jps
>>>> 29909 DataNode
>>>> 29751 NameNode
>>>> 30052 SecondaryNameNode
>>>>
>>> On the master node, the output of "jps" will contain a TaskTracker,
>>> JobTracker, NameNode, SecondaryNameNode, and DataNode (optional,
>>> depending on your config), and your slaves will have a TaskTracker and
>>> DataNode in their jps output. If you need more help configuring Hadoop,
>>> I recommend you take a look at
>>>
>>>
>>> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>>>
>>>
>>>
>>>
>>>> Here are the contents of the Hadoop node tree.  The only things that
>>>> look like log files are the dncp_block_verification.log.curr files, and
>>>> those are empty.
>>>> Note the presence of the in_use.lock files, which suggests that this
>>>> node is indeed being used.
>>>>
>>>
>>> The logs will be in the "logs" directory under $HADOOP_HOME (the Hadoop
>>> home directory). Are you looking for logs in that directory?
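
The datanode log there is the first place to look for the copy failure. For
example (log file names follow the hadoop-<user>-<daemon>-<host>.log
convention, so the exact names on your machine will differ):

```
$ ls $HADOOP_HOME/logs
hadoop-rock-datanode-ritter.log    hadoop-rock-namenode-ritter.log    ...
$ tail -n 50 $HADOOP_HOME/logs/hadoop-rock-datanode-ritter.log
```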
>>>
>>>
>>> --
>>> Thanks,
>>> M. Varadharajan
>>>
>>> ------------------------------------------------
>>>
>>> "Experience is what you get when you didn't get what you wanted"
>>>               -By Prof. Randy Pausch in "The Last Lecture"
>>>
>>> My Journal :- www.thinkasgeek.wordpress.com
>>>
>>>
>>
>>
>


-- 
*Regards*,
Rahul Patodi
Associate Software Engineer,
Impetus Infotech (India) Pvt Ltd,
www.impetus.com
Mob:09907074413
