Date: Tue, 30 Jul 2013 17:42:16 +0800
From: ch huang <justlooks@gmail.com>
To: user@hadoop.apache.org
Subject: datanode error "Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_"

I do not know how to solve this; can anyone help?

2013-07-30 17:28:40,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_458861 received exception org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_458861
2013-07-30 17:28:40,953 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: CH34:50011:DataXceiver error processing WRITE_BLOCK operation src: /192.168.2.209:4421 dest: /192.168.10.34:50011
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_458861
        at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.getReplicaInfo(FsDatasetImpl.java:353)
        at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:489)
        at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:92)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.<init>(BlockReceiver.java:168)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:451)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:103)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:67)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
        at java.lang.Thread.run(Thread.java:662)
2013-07-30 17:28:40,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863 src: /192.168.2.209:4423 dest: /192.168.10.34:50011
2013-07-30 17:28:40,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863 received exception org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863
2013-07-30 17:28:40,978 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: CH34:50011:DataXceiver error processing WRITE_BLOCK operation src: /192.168.2.209:4423 dest: /192.168.10.34:50011
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863
        at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.getReplicaInfo(FsDatasetImpl.java:353)
        at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:489)
        at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:92)
        at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.<init>(BlockReceiver.java:168)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:451)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:103)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:67)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
        at java.lang.Thread.run(Thread.java:662)
2013-07-30 17:28:41,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1099828917-192.168.10.22-1373361366827:blk_7728515140810267551_458865 src: /192.168.2.209:4426 dest: /192.168.10.34:50011
2013-07-30 17:28:41,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1099828917-192.168.10.22-1373361366827:blk_7728515140810267551_458865 received exception org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7728515140810267551_458865
2013-07-30 17:28:41,002 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: CH34:50011:DataXceiver error processing WRITE_BLOCK operation src: /192.168.2.209:4426 dest: /192.168.10.34:50011