hadoop-common-user mailing list archives

From Kim Vogt <...@simplegeo.com>
Subject MapFiles error "Could not obtain block"
Date Thu, 18 Nov 2010 20:45:12 GMT
Hi,

I'm using MapFileOutputFormat to look up values in MapFiles and keep
getting "Could not obtain block" errors.  I'm thinking it might be because
ulimit is not set high enough.  Has anyone else run into this issue?
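
For context, the lookup is basically the pattern below (just a sketch, not my
actual job -- the Text key/value types and the literal key are placeholders,
but /mydata/part-r-00000 is the path from the error):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.io.MapFile;
    import org.apache.hadoop.io.Text;

    public class MapFileLookup {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // A MapFile is a directory holding a sorted "data" SequenceFile plus
        // an "index" file.  Opening the reader reads the index and the data
        // file header, which is the step failing in the trace below.
        MapFile.Reader reader = new MapFile.Reader(fs, "/mydata/part-r-00000", conf);
        try {
          Text key = new Text("some-key");   // placeholder key
          Text value = new Text();
          // get() binary-searches the index, seeks into the data file, and
          // fills 'value' if the key exists; returns null when it is absent.
          if (reader.get(key, value) != null) {
            System.out.println(key + " => " + value);
          }
        } finally {
          reader.close();
        }
      }
    }

Since each MapFile.Reader keeps both the data and index files open, opening
many of them per task is what made me suspect the open-file limit (ulimit).

Here's the stack trace from one of the failing attempts: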

attempt_201011180019_0005_m_000003_0: Caught exception while getting cached files: java.io.IOException: Could not obtain block: blk_-7027776556206952935_61338 file=/mydata/part-r-00000/data
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1976)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1783)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1931)
attempt_201011180019_0005_m_000003_0:     at java.io.DataInputStream.readFully(DataInputStream.java:178)
attempt_201011180019_0005_m_000003_0:     at java.io.DataInputStream.readFully(DataInputStream.java:152)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1457)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1435)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1424)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1419)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.MapFile$Reader.createDataFileReader(MapFile.java:302)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.MapFile$Reader.open(MapFile.java:284)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.MapFile$Reader.<init>(MapFile.java:273)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.MapFile$Reader.<init>(MapFile.java:260)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.io.MapFile$Reader.<init>(MapFile.java:253)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:639)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:315)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
attempt_201011180019_0005_m_000003_0:     at java.security.AccessController.doPrivileged(Native Method)
attempt_201011180019_0005_m_000003_0:     at javax.security.auth.Subject.doAs(Subject.java:396)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1063)
attempt_201011180019_0005_m_000003_0:     at org.apache.hadoop.mapred.Child.main(Child.java:211)

-Kim
