hadoop-hdfs-user mailing list archives

From nipun_mlist Assam <nipunml...@gmail.com>
Subject Re: problem in copying files
Date Tue, 29 Sep 2009 08:04:02 GMT
>> I recommend that the first command you run after all daemons are formatted and started
>> is to create your home directory (before you upload files):
Yes, that worked.
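For context on why the `mkdir` below fixes things: the HDFS shell resolves a relative path (such as the bare `.` in the commands further down) against a per-user home directory, conventionally `/user/<username>`, and unlike a local filesystem that directory is never created automatically. A minimal sketch of the resolution rule in plain shell (the path convention matches stock Hadoop, but nothing here talks to a real cluster):

```shell
# Sketch of how the HDFS shell resolves paths: absolute paths are
# used as-is, relative paths land under /user/<username>. This is
# an illustration of the convention, not Hadoop's actual code.
hdfs_resolve() {
  case "$1" in
    /*) echo "$1" ;;                  # absolute: used as-is
    *)  echo "/user/$(whoami)/$1" ;;  # relative: under the home dir
  esac
}

hdfs_resolve /tmp/data   # -> /tmp/data
hdfs_resolve .vimrc      # -> /user/<you>/.vimrc
```

So `hadoop dfs -ls` with no argument asks for `.`, which resolves to `/user/<you>` and fails with "Cannot access ." until that directory exists.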

On Tue, Sep 29, 2009 at 1:11 PM, Dhruba Borthakur <dhruba@gmail.com> wrote:
> I recommend that the first command you run after all daemons are formatted
> and started is to create your home directory (before you upload files):
>
> $hadoop dfs -ls
> ls: Cannot access .: No such file or directory.
> $hadoop dfs -mkdir /user/ninput
> $ hadoop dfs -copyFromLocal      .vimrc    .
> $ hadoop dfs -ls
> Found 1 items
> -rw-r--r--   3 nipunt supergroup       3051 2009-09-28 23:14
> /user/ninput/.vimrc
> $ hadoop dfs -copyFromLocal    .bashrc     .
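One wrinkle worth noting (an inference from the `-ls` output further down the thread, not something stated outright): if `copyFromLocal ... .` was already run before the home directory existed, the upload can land as a plain *file* named `/user/<username>`, in which case the `mkdir` itself will fail. A hedged cleanup sketch, using the username from this thread (adjust to your own; requires the running 0.20-era pseudo-distributed cluster, so none of these commands work in isolation):

```shell
# The stray file /user/nipunt is the artifact of copying into '.'
# before the home directory existed.
hadoop dfs -rm /user/nipunt         # remove the stray file, if present
hadoop dfs -mkdir /user/nipunt      # create the real home directory
hadoop dfs -copyFromLocal .vimrc .  # relative destinations now resolve
hadoop dfs -ls                      # should list /user/nipunt/.vimrc
```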
>
>
> On Mon, Sep 28, 2009 at 11:16 PM, nipun_mlist Assam <nipunmlist@gmail.com>
> wrote:
>>
>> Hi All,
>>
>> I have installed hadoop 0.20.1 on my system and set up a
>> pseudo-distributed cluster configuration.
>> Below is the core-site.xml I am using:
>>
>> <?xml version="1.0"?>
>> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>> <configuration>
>> <property>
>>  <name>hadoop.tmp.dir</name>
>>  <value>/tmp/hadoop-${user.name}</value>
>> </property>
>> <property>
>>  <name>fs.default.name</name>
>>  <value>hdfs://localhost:54310</value>
>> </property>
>> <property>
>>  <name>mapred.job.tracker</name>
>>  <value>hdfs://localhost:54311</value>
>> </property>
>> <property>
>>  <name>dfs.replication</name>
>>  <value>1</value>
>> </property>
>> <property>
>>  <name>mapred.child.java.opts</name>
>>  <value>-Xmx512m</value>
>> </property>
>> </configuration>
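Two asides on the config itself (neither is the cause of the error below): in 0.20 the mapred.* properties conventionally live in mapred-site.xml rather than core-site.xml, and mapred.job.tracker is documented as a plain host:port pair (e.g. localhost:54311), so the hdfs:// scheme on it is unnecessary. After start-all.sh, a quick sanity check that the daemons actually came up (assumes the JDK's jps and the hadoop script are on PATH; nothing here modifies the cluster):

```shell
jps                       # expect NameNode, DataNode, SecondaryNameNode,
                          # JobTracker, TaskTracker
hadoop dfsadmin -report   # confirms the NameNode at fs.default.name is up
                          # and the lone DataNode has registered
```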
>>
>>
>> I formatted HDFS using the command "$ hadoop namenode -format" with no
>> errors.
>> Then I started all components using "$ start-all.sh", again with no
>> errors.
>> Environment variables such as HADOOP_HOME are correctly defined.
>>
>> Now I get the following errors:
>>
>> $hadoop dfs -ls
>> ls: Cannot access .: No such file or directory.
>>
>> $ hadoop dfs -copyFromLocal      .vimrc    .
>> $ hadoop dfs -ls
>> Found 1 items
>> -rw-r--r--   3 nipunt supergroup       3051 2009-09-28 23:14 /user/nipunt
>> $ hadoop dfs -copyFromLocal    .bashrc     .
>> copyFromLocal: Target  already exists
>>
>> What is the root cause of the problem?
>> How can I overcome it?
>
>
>
> --
> Connect to me at http://www.facebook.com/dhruba
>
