Subject: Re: Intermittent BindException during long MR jobs
From: Ted Yu <yuzhihong@gmail.com>
To: "common-user@hadoop.apache.org"
Cc: krishnanjrao@gmail.com, Hive
Date: Sat, 28 Feb 2015 07:20:16 -0800
In-Reply-To: <00fc01d05256$e4174620$ac45d260$@visolve.com>

Krishna:

Please take a look at: http://wiki.apache.org/hadoop/BindException

Cheers

On Thu, Feb 26, 2015 at 10:30 PM, <hadoop.support@visolve.com> wrote:

> Hello Krishna,
>
> The exception seems to be IP-specific. It may be caused by no IP address
> being available in the system to assign. Double-check the IP address
> availability and run the job.
>
> Thanks,
>
> S.RagavendraGanesh
>
> ViSolve Hadoop Support Team
> ViSolve Inc. | San Jose, California
> Website: www.visolve.com
> email: services@visolve.com | Phone: 408-850-2243
>
> From: Krishna Rao [mailto:krishnanjrao@gmail.com]
> Sent: Thursday, February 26, 2015 9:48 PM
> To: user@hive.apache.org; user@hadoop.apache.org
> Subject: Intermittent BindException during long MR jobs
>
> Hi,
>
> We occasionally run into a BindException that causes long-running jobs to
> fail.
>
> The stack trace is below.
>
> Any ideas what could be causing this?
>
> Cheers,
>
> Krishna
>
> Stacktrace:
>
> 379969 [Thread-980] ERROR org.apache.hadoop.hive.ql.exec.Task - Job
> Submission failed with exception 'java.net.BindException(Problem binding
> to [back10/10.4.2.10:0] java.net.BindException: Cannot assign requested
> address; For more details see:
> http://wiki.apache.org/hadoop/BindException)'
>
> java.net.BindException: Problem binding to [back10/10.4.2.10:0]
> java.net.BindException: Cannot assign requested address; For more details
> see: http://wiki.apache.org/hadoop/BindException
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:718)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1242)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>         at com.sun.proxy.$Proxy10.create(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:193)
>         at sun.reflect.GeneratedMethodAccessor43.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>         at com.sun.proxy.$Proxy11.create(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1376)
>         at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1395)
>         at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1255)
>         at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1212)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:276)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:265)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:82)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:888)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:869)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:768)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:757)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:558)
>         at org.apache.hadoop.mapreduce.split.JobSplitWriter.createFile(JobSplitWriter.java:96)
>         at org.apache.hadoop.mapreduce.split.JobSplitWriter.createSplitFiles(JobSplitWriter.java:85)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:517)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:487)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:369)
>         at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1286)
>         at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1283)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1283)
>         at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:606)
>         at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:601)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:601)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:586)
>         at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:448)
>         at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:138)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:66)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:56)
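
A "Cannot assign requested address" bind failure usually means the local
address being bound (here, whatever back10/10.4.2.10 resolves to at that
moment) is not configured on any interface of the machine; transient
ephemeral-port exhaustion on a busy cluster node can look similar. Below is
a minimal diagnostic sketch, assuming it is run on the affected node (the
class name BindCheck and the default hostname are illustrative, not part of
Hadoop):

    import java.net.InetAddress;
    import java.net.InetSocketAddress;
    import java.net.ServerSocket;

    public class BindCheck {
        public static void main(String[] args) throws Exception {
            // Hostname to test; defaults to the host from the stack trace.
            String host = args.length > 0 ? args[0] : "back10";
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " resolves to " + addr.getHostAddress());
            // Port 0 asks the OS for any free ephemeral port. If this
            // throws "Cannot assign requested address", the resolved IP
            // is not bound to a local interface (DNS/hosts mismatch).
            try (ServerSocket ss = new ServerSocket()) {
                ss.bind(new InetSocketAddress(addr, 0));
                System.out.println("bind OK on " + ss.getLocalSocketAddress());
            }
        }
    }

Running this repeatedly while a long job is in flight can help separate a
DNS/interface mismatch (resolution prints an unexpected IP and the bind
fails every time) from ephemeral-port exhaustion (binds mostly succeed and
failures come and go).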