hadoop-hdfs-dev mailing list archives

From Aaron Kimball <aa...@cloudera.com>
Subject Re: create error
Date Mon, 05 Jul 2010 07:49:08 GMT
Is there a reason you're using that particular interface? That's very
low-level.

See http://wiki.apache.org/hadoop/HadoopDfsReadWriteExample for the proper
API to use.
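For what it's worth, the immediate bug in the quoted snippet is that InputStream.read() may return fewer bytes than requested even in mid-stream, while the loop always writes buf.length bytes; available() is also only a hint of what can be read without blocking, not an EOF test. A safe copy loop looks like the sketch below (shown with plain java.io streams so it stands alone; the same read-then-write-len pattern applies unchanged when the output stream is an HDFS stream):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class CopyLoop {
    // Copy everything from in to out. read() may return fewer bytes than
    // buf.length, so write exactly the number of bytes it reports, and use
    // its -1 return value to detect EOF instead of available().
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[65536];
        long total = 0;
        int len;
        while ((len = in.read(buf)) != -1) {
            out.write(buf, 0, len);
            total += len;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // 70000 bytes: deliberately not a multiple of the buffer size,
        // the case that triggered the EOFException in the original loop.
        byte[] data = new byte[70000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // 70000
    }
}
```

The wiki example linked above does this for you through the FileSystem API, so you normally never need to hand-roll the loop at all.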

- Aaron

On Sat, Jul 3, 2010 at 1:36 AM, Vidur Goyal <vidur@students.iiit.ac.in> wrote:

> Hi,
>
> I am trying to create a file in HDFS. I am calling create() from an
> instance of DFSClient. This is the part of the code I am using:
>
> byte[] buf = new byte[65536];
> int len;
> while ((len = dis.available()) != 0) {
>     if (len < buf.length) {
>         break;
>     } else {
>         dis.read(buf, 0, buf.length);
>         ds.write(buf, 0, buf.length);
>     }
> }
>
> dis is the DataInputStream for the local file system from which I am
> copying the file, and ds is the DataOutputStream to HDFS.
>
> I get these errors:
>
> 2010-07-03 13:45:07,480 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode:
> DatanodeRegistration(127.0.0.1:50010,
> storageID=DS-455297472-127.0.0.1-50010-1278144155322, infoPort=50075,
> ipcPort=50020):DataXceiver
> java.io.EOFException: while trying to read 65557 bytes
>        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readToBuf(BlockReceiver.java:265)
>        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.readNextPacket(BlockReceiver.java:309)
>        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:373)
>        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:525)
>        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:357)
>        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
>        at java.lang.Thread.run(Thread.java:636)
>
>
> When I run the loop a number of times that is a multiple of the block
> size, the operation runs just fine. As soon as I change the buffer array
> size to a non-block size, it starts giving errors.
> I am in the middle of a project. Any help will be appreciated.
>
> thanks
> vidur
>
>
>
