Subject: Re: issue about pig can not know HDFS HA configuration
From: Jagannath Naidu
To: user@hadoop.apache.org
Date: Wed, 5 Nov 2014 17:11:01 +0530

On 5 November 2014 14:49, ch huang wrote:

> hi, maillist:
> I set up NameNode HA in my HDFS cluster, but it seems Pig does not know about it. Why?
>
> 2014-11-05 14:34:54,710 [JobControl] INFO  org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area file:/tmp/hadoop-root/mapred/staging/root1861403840/.staging/job_local1861403840_0001
> 2014-11-05 14:34:54,716 [JobControl] WARN  org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.pig.backend.executionengine.ExecException: ERROR 2118: java.net.UnknownHostException: develop

Unknown host exception: this can be the issue. Check that the host is discoverable, either from DNS or from the hosts file.
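As a quick sanity check of that suggestion (a sketch; "develop" is simply the name taken from the UnknownHostException above):

```shell
# Check whether "develop" resolves. getent consults the same resolver
# order (hosts file, then DNS) that the JVM ultimately relies on.
getent hosts develop || echo "develop does not resolve"

# If it is a plain host, an /etc/hosts entry would fix resolution, e.g.:
#   192.168.0.10  develop
```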
> 2014-11-05 14:34:54,717 [JobControl] INFO  org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob - PigLatin:DefaultJobName got an error while submitting
> org.apache.pig.backend.executionengine.ExecException: ERROR 2118: java.net.UnknownHostException: develop
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
>         at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
>         at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
>         at java.lang.Thread.run(Thread.java:744)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
> Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: develop
>         at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
>         at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:237)
>         at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:141)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:576)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:521)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146)
>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
>         at org.apache.hcatalog.mapreduce.HCatBaseInputFormat.setInputPath(HCatBaseInputFormat.java:326)
>         at org.apache.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:127)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
>         ... 18 more
> Caused by: java.net.UnknownHostException: develop
>         ... 33 more

--
Jaggu Naidu
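One more note: the `createNonHAProxy` frame in the trace suggests the client treated "develop" as a single NameNode host. If "develop" is in fact the HA nameservice ID (which the subject line implies), the fix is not DNS but making the HA client settings visible to Pig, e.g. by pointing HADOOP_CONF_DIR at the cluster configs. A sketch of the relevant hdfs-site.xml entries follows; the NameNode host names and ports are placeholders, not values from this thread:

```xml
<!-- Client-side HDFS HA settings for the nameservice "develop".
     Host names and ports below are illustrative placeholders. -->
<property>
  <name>dfs.nameservices</name>
  <value>develop</value>
</property>
<property>
  <name>dfs.ha.namenodes.develop</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.develop.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.develop.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.develop</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these on Pig's classpath, `hdfs://develop/...` resolves through the failover proxy provider instead of a hostname lookup.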