hadoop-mapreduce-user mailing list archives

From Mallanagouda Patil <mallanagouda.c.pa...@gmail.com>
Subject Re: Error While copying file from local to dfs
Date Sun, 06 Mar 2016 08:35:10 GMT
Hi Vinod,

Can you try the following?
1. In core-site.xml, set fs.defaultFS to hdfs://localhost (the NameNode URI).
2. Restart HDFS: run stop-dfs.sh and then start-dfs.sh.
3. Try this command:
hadoop fs -copyFromLocal sourcefile /
It copies the file from the local file system to the HDFS root directory.
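For step 1, a minimal core-site.xml sketch (assuming no port in the URI, in which case HDFS falls back to the default NameNode RPC port, 8020):

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- NameNode URI; with no explicit port, clients use the default RPC port 8020 -->
    <value>hdfs://localhost</value>
  </property>
</configuration>
```

Note this must point at the NameNode's RPC address, not a DataNode port such as 50075 or 50010.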
I hope it helps.
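As for checking the install and finding the NameNode/DataNode details you asked about: assuming the daemons are running and HADOOP_HOME is on your PATH, the stock hdfs commands below print the configured NameNode URI, the replication factor, and a per-DataNode report (a sketch, not cluster-specific):

```shell
# Print the NameNode URI from the loaded configuration
hdfs getconf -confKey fs.defaultFS

# Print the configured replication factor
hdfs getconf -confKey dfs.replication

# List live DataNodes with their hostnames, ports, and capacity
hdfs dfsadmin -report
```
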

Thanks
Mallan
On Mar 5, 2016 11:11 AM, "Vinodh Nagaraj" <vinodh.dba.c@gmail.com> wrote:

> Hi All,
>
> Please help me.
>
> Thanks & Regards,
> Vinodh.N
>
> On Fri, Mar 4, 2016 at 6:29 PM, Vinodh Nagaraj <vinodh.dba.c@gmail.com>
> wrote:
>
>> Hi All,
>>
>> I am a newbie to Hadoop.
>>
>> I installed Hadoop 2.7.1 on a 32-bit Windows 7 machine for learning
>> purposes.
>>
>> I can execute start-all.cmd successfully.
>>
>> When I execute jps, I get the output below.
>> 28544 NameNode
>> 35728
>> 36308 DataNode
>> 43828 Jps
>> 40688 NodeManager
>> 33820 ResourceManager
>>
>> My configuration files are:
>>
>> core-site.xml
>> ---------------------
>> <configuration>
>>  <property>
>>        <name>fs.defaultFS</name>
>>        <value>hdfs://10.219.149.100:50075/</value>
>>        <description>NameNode URI</description>
>>  </property>
>> </configuration>
>>
>>
>>
>> hdfs-site.xml
>> ---------------------
>> <configuration>
>>    <property>
>>      <name>dfs.replication</name>
>>      <value>2</value>
>>     </property>
>>     <property>
>>       <name>dfs.namenode.name.dir</name>
>>       <value>D:\Hadoop_TEST\Hadoop\Data</value>
>>     </property>
>>     <property>
>>      <name>dfs.datanode.data.dir</name>
>>       <value>D:\Hadoop_TEST\Hadoop\Secondary</value>
>>    </property>
>>
>>   <property>
>>       <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
>>
>>      <value>false</value>
>>    </property>
>> </configuration>
>>
>> I tried to copy a text file from my local drive to the HDFS file system,
>> but I got the error below.
>>
>> *D:\Hadoop_TEST\Hadoop\ts>hadoop fs -copyFromLocal 4300.txt
>> hdfs://10.219.149.100:50010/a.txt*
>> *copyFromLocal: End of File Exception between local host is:
>> "PC205172/10.219.149.100"; destination host is:
>> "PC205172.cts.com":50010; : java.io.EOFException;
>> For more details see:  http://wiki.apache.org/hadoop/EOFException*
>>
>>
>> Please share your suggestions.
>>
>> How can I verify whether I have installed Hadoop properly?
>> How can I identify the DataNode location, DataNode port, and other
>> details with an hdfs or hadoop command?
>> How can I identify the NameNode location, NameNode port, and its
>> configuration details (e.g. how many replicas) with an hdfs or hadoop
>> command?
>>
>> Thanks & Regards,
>> Vinodh.N
>>
>>
>>
>
