hadoop-mapreduce-user mailing list archives

From Dhanasekaran Anbalagan <bugcy...@gmail.com>
Subject Re: DataXceiver error processing WRITE_BLOCK operation src: /x.x.x.x:50373 dest: /x.x.x.x:50010
Date Fri, 08 Mar 2013 09:42:57 GMT
Hi Varun,

I believe this is not a ulimit issue; the limits are already set very high:


/etc/security/limits.conf
# End of file
*               -      nofile          1000000
*               -      nproc           1000000
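
For what it's worth, this is how I double-checked that the limit is actually
in effect for the running DataNode (assuming it runs as the hdfs user; adjust
the user and the pgrep pattern to your setup):

su - hdfs -c 'ulimit -n'                                        # limit a fresh login shell gets
grep 'open files' /proc/$(pgrep -f DataNode | head -1)/limits   # limit of the live process

Both report the 1000000 configured above, so the limit does seem to be applied.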


Please guide me, guys; I want to fix this. Please share your thoughts on this
DataXceiver error.
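
Since the descriptors look fine, my next suspects are the datanode transfer
thread cap and the socket timeouts (the 70000 ms in the first trace below
looks like dfs.socket.timeout, 60000 ms by default, plus the per-node
extension). This is only a sketch of what I plan to try in hdfs-site.xml on
every datanode, with guessed values and a datanode restart afterwards:

<!-- hdfs-site.xml: illustrative values, not a tested recommendation -->
<property>
  <name>dfs.datanode.max.transfer.threads</name>
  <value>8192</value>   <!-- dfs.datanode.max.xcievers on older releases; default 4096 -->
</property>
<property>
  <name>dfs.socket.timeout</name>
  <value>120000</value>   <!-- read timeout, in milliseconds -->
</property>
<property>
  <name>dfs.datanode.socket.write.timeout</name>
  <value>120000</value>   <!-- write timeout, in milliseconds -->
</property>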

Did I learn something today? If not, I wasted it.


On Fri, Mar 8, 2013 at 3:50 AM, varun kumar <varun.uid@gmail.com> wrote:

> Hi Dhana,
>
> Increase the ulimit for all the datanodes.
>
> If you are starting the service as the hadoop user, increase the ulimit
> value for the hadoop user.
>
> Make the changes in the following file:
>
> */etc/security/limits.conf*
>
> Example:-
> *hadoop          soft    nofile          35000*
> *hadoop          hard    nofile          35000*
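>
> One thing to double-check: limits.conf only takes effect when PAM loads
> pam_limits, and a running datanode keeps its old limit until it is
> restarted. On most Linux distributions that means a line like this in
> /etc/pam.d/common-session (the file name varies by distribution):
>
> session    required    pam_limits.so
>
> After logging in again, verify with: su - hadoop -c 'ulimit -n'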
>
> Regards,
> Varun Kumar.P
>
> On Fri, Mar 8, 2013 at 1:15 PM, Dhanasekaran Anbalagan <bugcy013@gmail.com> wrote:
>
>> Hi guys,
>>
>> I am frequently getting this error on my datanodes.
>>
>> Please help me figure out what exactly the problem is.
>>
>>
>> dvcliftonhera138:50010:DataXceiver error processing WRITE_BLOCK operation src: /172.16.30.138:50373 dest: /172.16.30.138:50010
>>
>> java.net.SocketTimeoutException: 70000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/172.16.30.138:34280 remote=/172.16.30.140:50010]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:154)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:127)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:115)
>> at java.io.FilterInputStream.read(FilterInputStream.java:66)
>> at java.io.FilterInputStream.read(FilterInputStream.java:66)
>> at org.apache.hadoop.hdfs.protocol.HdfsProtoUtil.vintPrefixed(HdfsProtoUtil.java:160)
>> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:405)
>> at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:98)
>> at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:66)
>> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:189)
>> at java.lang.Thread.run(Thread.java:662)
>>
>> dvcliftonhera138:50010:DataXceiver error processing WRITE_BLOCK operation src: /172.16.30.138:50531 dest: /172.16.30.138:50010
>>
>> java.io.EOFException: while trying to read 65563 bytes
>> at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:408)
>> at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:452)
>> at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:511)
>> at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:748)
>> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:462)
>> at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:98)
>> at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:66)
>> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:189)
>> at java.lang.Thread.run(Thread.java:662)
>>
>> How do I resolve this?
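>>
>> In the meantime, this is how I am trying to correlate each failure with
>> the remote datanode in the trace (the log path is a guess for my install;
>> adjust it to yours):
>>
>> grep 'DataXceiver error' /var/log/hadoop-hdfs/*.log | tail -n 20
>> grep '172.16.30.140:50010' /var/log/hadoop-hdfs/*.log | tail -n 20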
>>
>> -Dhanasekaran.
>>
>> Did I learn something today? If not, I wasted it.
>>
>
> --
> Regards,
> Varun Kumar.P
>
