From: "kiranprasad" <kiranprasad.g@imimobile.com>
To: hdfs-user@hadoop.apache.org
Subject: ERROR 1066: Unable to open iterator for alias A. Backend error: Could not obtain block:
Date: Fri, 7 Oct 2011 14:16:07 +0530

Hi

I've checked with the below-mentioned command, and I am getting:

[kiranprasad.g@pig4 hadoop-0.20.2]$ bin/hadoop fs -text /data/arpumsisdn.txt | tail
11/10/07 16:17:18 INFO hdfs.DFSClient: No node available for block:
blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
11/10/07 16:17:18 INFO hdfs.DFSClient: Could not obtain block
blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
nodes contain current block
11/10/07 16:17:21 INFO hdfs.DFSClient: No node available for block:
blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
11/10/07 16:17:21 INFO hdfs.DFSClient: Could not obtain block
blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
nodes contain current block
11/10/07 16:17:25 INFO hdfs.DFSClient: No node available for block:
blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
11/10/07 16:17:25 INFO hdfs.DFSClient: Could not obtain block
blk_-8354424441116992221_1060 from any node: java.io.IOException: No live
nodes contain current block
11/10/07 16:17:29 WARN hdfs.DFSClient: DFS Read: java.io.IOException: Could
not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1695)
        at java.io.DataInputStream.readShort(DataInputStream.java:295)
        at org.apache.hadoop.fs.FsShell.forMagic(FsShell.java:397)
        at org.apache.hadoop.fs.FsShell.access$200(FsShell.java:49)
        at org.apache.hadoop.fs.FsShell$2.process(FsShell.java:420)
        at org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1898)
        at org.apache.hadoop.fs.FsShell.text(FsShell.java:414)
        at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1563)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:1763)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
text: Could not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt

The block is not available. How can I recover the data block?

-----Original Message-----
From: Alex Rovner
Sent: Wednesday, October 05, 2011 5:55 PM
To: user@pig.apache.org
Subject: Re: ERROR 1066: Unable to open iterator for alias A. Backend error: Could not obtain block:

You can also test quickly whether that's the issue by running the following
command:

hadoop fs -text /data/arpumsisdn.txt | tail

On Wed, Oct 5, 2011 at 8:24 AM, Alex Rovner <alexrovner@gmail.com> wrote:

Kiran,

This looks like your HDFS is missing some blocks. Can you run fsck and see
if you have missing blocks, and if so, for what files?

http://hadoop.apache.org/common/docs/r0.17.2/hdfs_user_guide.html#Fsck

Alex

On Tue, Oct 4, 2011 at 7:53 AM, kiranprasad <kiranprasad.g@imimobile.com> wrote:

I am getting the below exception when trying to execute a Pig Latin script.

Failed!

Failed Jobs:
JobId                   Alias   Feature         Message
job_201110042009_0005   A       MAP_ONLY        Message: Job failed!
hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,

Input(s):
Failed to read data from "/data/arpumsisdn.txt"

Output(s):
Failed to produce result in "hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_201110042009_0005

2011-10-04 22:13:53,736 [main] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
- Failed!
2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt -
ERROR 1066: Unable to open iterator for alias A. Backend error : Could not
obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log

Regards
Kiran.G
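[Archive note] The fsck check Alex suggests can be sketched as below. This is a hedged sketch against Hadoop 0.20-era command syntax (the version named in the thread); the path is the file from the thread, and the exact output format may differ on other releases. It requires a running HDFS cluster, so it is not runnable standalone.

```shell
# Overall filesystem health; a "Status: CORRUPT" summary means blocks are
# missing or under-replicated somewhere in the namespace.
bin/hadoop fsck /

# Narrow to the failing file: show its blocks and which datanodes are
# expected to hold each replica.
bin/hadoop fsck /data/arpumsisdn.txt -files -blocks -locations

# If the missing block's replicas live on datanodes that are merely down,
# bringing those datanodes back online lets the namenode find the block
# again. If every replica is truly lost (e.g. replication factor 1 and a
# dead disk), the data cannot be recovered from HDFS; the corrupt file can
# only be moved aside or deleted so fsck reports HEALTHY again:
#   bin/hadoop fsck / -move     # moves corrupted files to /lost+found
#   bin/hadoop fsck / -delete   # deletes corrupted files -- destructive!
```

Checking `dfs.replication` in hdfs-site.xml is worthwhile here: with a replication factor above 1, a single dead datanode should not make a block unobtainable.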