hadoop-common-user mailing list archives

From <hadoop.supp...@visolve.com>
Subject RE: Intermittent BindException during long MR jobs
Date Fri, 27 Feb 2015 06:30:31 GMT
Hello Krishna,

The exception seems to be IP-specific. It may occur when the requested IP address is not available on the system for the socket to bind to. Double-check that the IP address is available on the node that submits the job, then run it again.
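For what it's worth, the trace shows the job client failing while it writes the job split file to HDFS: the IPC client tries to bind its local end to back10/10.4.2.10 with port 0 (an OS-chosen ephemeral port) before connecting to the NameNode, and that bind is what throws. A small stand-alone check along these lines (just a sketch; the class name BindCheck and the hard-coded 10.4.2.10 are only illustrative, taken from your trace) can tell you whether that bind can succeed on the submitting node:

import java.net.BindException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class BindCheck {
    public static void main(String[] args) throws Exception {
        // Address the IPC client tried to bind to, per the stack trace below.
        String localAddr = args.length > 0 ? args[0] : "10.4.2.10";
        try (Socket s = new Socket()) {
            // Port 0 asks the OS for a free ephemeral port, which is what
            // the Hadoop client does before connecting.
            s.bind(new InetSocketAddress(localAddr, 0));
            System.out.println("OK, bound to " + s.getLocalSocketAddress());
        } catch (BindException e) {
            // "Cannot assign requested address" here usually means the IP is
            // not configured on this host, or ephemeral ports are exhausted.
            System.out.println("Bind failed: " + e.getMessage());
        }
    }
}

If the bind succeeds every time you run it, intermittent ephemeral-port exhaustion on the client (many sockets left in TIME_WAIT during a long job) is another common cause of "Cannot assign requested address" and is worth ruling out.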

 

Thanks,

S.RagavendraGanesh

ViSolve Hadoop Support Team
ViSolve Inc. | San Jose, California
Website: www.visolve.com

email: services@visolve.com | Phone: 408-850-2243

From: Krishna Rao [mailto:krishnanjrao@gmail.com] 
Sent: Thursday, February 26, 2015 9:48 PM
To: user@hive.apache.org; user@hadoop.apache.org
Subject: Intermittent BindException during long MR jobs

Hi,

We occasionally run into a BindException that causes long-running jobs to fail.

The stack trace is below.

Any ideas what could be causing this?

Cheers,

Krishna

Stacktrace:

379969 [Thread-980] ERROR org.apache.hadoop.hive.ql.exec.Task  - Job Submission failed with exception 'java.net.BindException(Problem binding to [back10/10.4.2.10:0] java.net.BindException: Cannot assign requested address; For more details see: http://wiki.apache.org/hadoop/BindException)'
java.net.BindException: Problem binding to [back10/10.4.2.10:0] java.net.BindException: Cannot assign requested address; For more details see: http://wiki.apache.org/hadoop/BindException

        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:718)
        at org.apache.hadoop.ipc.Client.call(Client.java:1242)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
        at com.sun.proxy.$Proxy10.create(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:193)
        at sun.reflect.GeneratedMethodAccessor43.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
        at com.sun.proxy.$Proxy11.create(Unknown Source)
        at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1376)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1395)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1255)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1212)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:276)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:265)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:888)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:869)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:768)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:757)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:558)
        at org.apache.hadoop.mapreduce.split.JobSplitWriter.createFile(JobSplitWriter.java:96)
        at org.apache.hadoop.mapreduce.split.JobSplitWriter.createSplitFiles(JobSplitWriter.java:85)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:517)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:487)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:369)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1286)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1283)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1283)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:606)
        at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:601)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:601)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:586)
        at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:448)
        at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:66)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:56)

 

