hadoop-hdfs-user mailing list archives

From Gokulakannan M <gok...@huawei.com>
Subject RE: exceptions i got in HDFS - append problem?
Date Mon, 12 Apr 2010 08:41:34 GMT

@Stack,

>> what hadoop version are you running?  hdfs-265 won't apply to hadoop
>> 0.20.x if that is what you are running	

	I am using Hadoop 0.20.1. So HDFS-265 cannot be applied to it?
Hmm.

>> Do you have hdfs-200 and friends applied to your cluster?

	No, I haven't applied them yet. Could you specify the patches
other than HDFS-200?

>> I haven't looked, but my guess is that scribe documentation probably
>> has description of the patchset required to run on hadoop.
 
	No, they are not mentioned :(

   Regards,
   Gokul
 

On Fri, Apr 9, 2010 at 3:07 AM, Gokulakannan M <gokulm@huawei.com> wrote:
> Hi,
>  I got the following exceptions , when I am using HDFS to write the logs
> coming from Scribe
>  1. java.io.IOException: Filesystem closed
>
>      <stack trace>
>      ........
>      ........
>      call to org.apache.hadoop.fs.FSDataOutputStream::write failed!
>

The above seems to say that the filesystem is closed and, as a
consequence, you are not able to write to it.
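[Editorial note: the "Filesystem closed" error typically happens because FileSystem.get() hands the same cached instance to every caller, so one component closing its handle breaks all the others. The sketch below is a self-contained toy model of that caching behavior, not Hadoop's actual code; HandleCache and the class names are hypothetical.]

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.HashMap;
import java.util.Map;

// Toy stand-in for Hadoop's FileSystem.get() cache (hypothetical names):
// every caller asking for the same key gets the SAME underlying handle.
class HandleCache {
    private static final Map<String, OutputStream> cache = new HashMap<>();

    static synchronized OutputStream get(String key) {
        return cache.computeIfAbsent(key, k -> new ByteArrayOutputStream() {
            private boolean closed = false;

            @Override
            public void close() {
                closed = true; // mimics FileSystem.close()
            }

            @Override
            public synchronized void write(int b) {
                if (closed) {
                    // mimics HDFS's "java.io.IOException: Filesystem closed"
                    throw new RuntimeException("Filesystem closed");
                }
                super.write(b);
            }
        });
    }
}

public class FsClosedDemo {
    public static void main(String[] args) throws IOException {
        OutputStream a = HandleCache.get("hdfs://cluster");
        OutputStream b = HandleCache.get("hdfs://cluster"); // same cached instance as 'a'
        a.close();          // one component shuts the shared handle...
        try {
            b.write('x');   // ...and every other holder now fails
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // "Filesystem closed"
        }
    }
}
```

If this is the cause, the usual fix is to ensure no component closes the shared instance while others still write through it.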

>  2. org.apache.hadoop.ipc.RemoteException:
> org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to
> create
>       file xxx-2010-04-01-12-40_00000 for DFSClient_1355960219 on client
> 10.18.22.55 because current leaseholder is trying to recreate file
>       <stack trace>
>      ........
>      ........
>      call to
> org.apache.hadoop.conf.FileSystem::append((Lorg/apache/hadoop/fs/Path;)Lorg/apache/hadoop/fs/FSDataOutputStream;) failed!
>

Someone holds the lease on the file you are trying to open?
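[Editorial note: HDFS enforces single-writer semantics by granting one lease per file; a writer that dies without releasing its lease blocks re-creation of the file until the lease is recovered or expires, producing AlreadyBeingCreatedException. The sketch below is a self-contained toy model of that lease table, not the namenode's real implementation; LeaseTable and its methods are hypothetical.]

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of the namenode's lease table (hypothetical names): at most one
// writer may hold the lease on a path at a time.
class LeaseTable {
    private final Map<String, String> leases = new HashMap<>(); // path -> holder

    void create(String path, String holder) {
        if (leases.containsKey(path)) {
            // mimics org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException
            throw new IllegalStateException("failed to create file " + path
                + " for " + holder
                + " because current leaseholder is trying to recreate file");
        }
        leases.put(path, holder);
    }

    void release(String path) {
        leases.remove(path); // normally done when the writer closes the file
    }
}

public class LeaseDemo {
    public static void main(String[] args) {
        LeaseTable namenode = new LeaseTable();
        namenode.create("/logs/xxx", "DFSClient_1");
        try {
            // a restarted writer retries before its old lease is released
            namenode.create("/logs/xxx", "DFSClient_1");
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
        namenode.release("/logs/xxx");          // once the lease is gone...
        namenode.create("/logs/xxx", "DFSClient_2"); // ...a new writer succeeds
    }
}
```

In the real system, waiting out the lease soft limit (or, in later Hadoop versions, explicitly recovering the lease) clears this state.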

You mention scribe.  Do you have hdfs-200 and friends applied to your
cluster?

>   I didn't apply the HDFS-265 to my hadoop patch yet.
>

What hadoop version are you running?  hdfs-265 won't apply to hadoop
0.20.x if that is what you are running.

>
>   Are these exceptions due to bugs in the existing append feature, or
> some other reason?
>
>  Should I apply the complete append patch, or will a simpler patch
> solve this?
>
I haven't looked, but my guess is that scribe documentation probably
has description of the patchset required to run on hadoop.

St.Ack

