hbase-user mailing list archives

From yeshwanth kumar <yeshwant...@gmail.com>
Subject Re: java.util.concurrent.ExecutionException
Date Tue, 02 Sep 2014 17:36:29 GMT
hi ted,

the configuration is fine; I have a couple of other MapReduce jobs running
against HBase without this issue.
Going through the logs of the job, I noticed that this exception shows up
only after some rows have been processed.
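For what it's worth, one quick way to confirm that hbase-site.xml is actually reaching the task is to log the effective client settings in the mapper's setup(). This is a minimal sketch (the class name is a placeholder and the log wording is illustrative; the property keys are the standard HBase/ZooKeeper ones):

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Mapper;

public class ConfigCheckMapper
    extends Mapper<Writable, Writable, Writable, Writable> {

  @Override
  protected void setup(Context context)
      throws IOException, InterruptedException {
    Configuration conf = context.getConfiguration();
    // If hbase-site.xml was not picked up by the task, these fall back to
    // the defaults ("localhost" / 2181), which would explain a client
    // trying to reach localhost/127.0.0.1:60020 instead of the real
    // regionserver address.
    System.err.println("hbase.zookeeper.quorum = "
        + conf.get("hbase.zookeeper.quorum"));
    System.err.println("hbase.zookeeper.property.clientPort = "
        + conf.get("hbase.zookeeper.property.clientPort"));
  }
}
```

If the quorum prints as "localhost" on a distributed cluster, the config never made it onto the task classpath.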

-yeshwanth


On Tue, Sep 2, 2014 at 10:59 PM, Ted Yu <yuzhihong@gmail.com> wrote:

> bq. Call to localhost/127.0.0.1:60020 failed
>
> Can you check whether the configuration from hbase-site.xml is correctly
> passed to your mapper?
>
> Cheers
>
>
> On Tue, Sep 2, 2014 at 10:25 AM, yeshwanth kumar <yeshwanth43@gmail.com>
> wrote:
>
> > hi, I am running HBase 0.94.20 on Hadoop 2.2.0.
> >
> > I am working on a MapReduce job that reads input from a table and
> > writes the processed output back to that table and to another table;
> > I am using the MultiTableOutputFormat class for that.
> >
> > While running the job I encounter the exception below, and as a result
> > the regionserver is crashing.
> >
> > 2014-09-02 07:56:47,790 WARN [main]
> > org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation:
> > Failed all from
> > region=crawl_webpage,,1408774462347.a311e4aed343baf54f49ac6519d0bbe8.,
> > hostname=localhost, port=60020
> > java.util.concurrent.ExecutionException: java.io.IOException: Call to
> > localhost/127.0.0.1:60020 failed on local exception: java.io.EOFException
> >     at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> >     at java.util.concurrent.FutureTask.get(FutureTask.java:188)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:1708)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatch(HConnectionManager.java:1560)
> >     at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:994)
> >     at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:850)
> >     at org.apache.hadoop.hbase.client.HTable.put(HTable.java:826)
> >     at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:132)
> >     at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:68)
> >     at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:634)
> >     at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
> >     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
> >     at com.serendio.icvs.analysis.text.EntitySearcherMR$EntitySearcherMapper.map(EntitySearcherMR.java:119)
> >     at com.serendio.icvs.analysis.text.EntitySearcherMR$EntitySearcherMapper.map(EntitySearcherMR.java:33)
> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
> >     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:415)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> > Caused by: java.io.IOException: Call to localhost/127.0.0.1:60020 failed
> > on local exception: java.io.EOFException
> >     at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:1047)
> >     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:1016)
> >     at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:87)
> >     at com.sun.proxy.$Proxy12.multi(Unknown Source)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3$1.call(HConnectionManager.java:1537)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3$1.call(HConnectionManager.java:1535)
> >     at org.apache.hadoop.hbase.client.ServerCallable.withoutRetries(ServerCallable.java:229)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3.call(HConnectionManager.java:1544)
> >     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3.call(HConnectionManager.java:1532)
> >     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >     at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.EOFException
> >     at java.io.DataInputStream.readInt(DataInputStream.java:392)
> >     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:654)
> >     at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)
> > Any suggestions for overcoming this issue?
> >
> > thanks,
> > yeshwanth
> >
>
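For readers finding this thread later: a driver wired up the way the original post describes (read from one table, emit Puts to that table and a second one via MultiTableOutputFormat) would look roughly like the sketch below under HBase 0.94. The class name `EntitySearcherJob` and the reference to the poster's `EntitySearcherMapper` are placeholders; only the table name `crawl_webpage` comes from the trace above.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class EntitySearcherJob {
  public static void main(String[] args) throws Exception {
    // HBaseConfiguration.create() loads hbase-site.xml from the classpath;
    // if that file is missing here (or on the tasks), the client silently
    // falls back to localhost, matching the failed call in the trace.
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "entity-searcher");
    job.setJarByClass(EntitySearcherJob.class);

    // Full scan of the source table.
    Scan scan = new Scan();
    scan.setCaching(500);        // fewer RPCs per mapper
    scan.setCacheBlocks(false);  // don't pollute the block cache from MR

    // With MultiTableOutputFormat the mapper's output key is the
    // destination table name, so one mapper can feed both tables.
    TableMapReduceUtil.initTableMapperJob(
        "crawl_webpage", scan, EntitySearcherMapper.class,
        ImmutableBytesWritable.class, Put.class, job);

    // Routes each Put/Delete to the table named by its key.
    job.setOutputFormatClass(MultiTableOutputFormat.class);
    job.setNumReduceTasks(0);  // map-only job

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

This requires the HBase 0.94 client jars on the classpath and a running cluster, so it is a wiring sketch rather than something runnable in isolation.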
