hadoop-mapreduce-user mailing list archives

From Brahma Reddy Battula <brahmareddy.batt...@huawei.com>
Subject RE: issue about pig can not know HDFS HA configuration
Date Wed, 05 Nov 2014 13:27:51 GMT
Hello Jagannath,

The exception below occurs when the Pig client cannot find the HDFS configuration. You need to
do the following (see also the hdfs-site.xml sketch after the two steps):


  1.  Set the PIG_CLASSPATH environment variable to the location of the cluster configuration directory
      (the directory that contains the core-site.xml, hdfs-site.xml and mapred-site.xml files):

      export PIG_CLASSPATH=/mycluster/conf

  2.  Set the HADOOP_CONF_DIR environment variable to the location of the same cluster configuration
      directory:

      export HADOOP_CONF_DIR=/mycluster/conf
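
In the HA case, the "java.net.UnknownHostException: develop" in the log quoted below usually means
that the hdfs-site.xml visible to the Pig client does not define the logical nameservice "develop",
so the client treats it as a plain hostname (note createNonHAProxy in the stack trace). A minimal
sketch of the HA client properties that hdfs-site.xml in /mycluster/conf would need; only the
nameservice name "develop" is taken from the log, the NameNode hostnames and port are assumptions:

  <!-- hdfs-site.xml on the Pig client (sketch; nn1.example.com / nn2.example.com are assumed hostnames) -->
  <property>
    <name>dfs.nameservices</name>
    <value>develop</value>
  </property>
  <property>
    <name>dfs.ha.namenodes.develop</name>
    <value>nn1,nn2</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.develop.nn1</name>
    <value>nn1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.develop.nn2</name>
    <value>nn2.example.com:8020</value>
  </property>
  <property>
    <name>dfs.client.failover.proxy.provider.develop</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>

core-site.xml in the same directory would then point the default filesystem at the nameservice,
i.e. fs.defaultFS = hdfs://develop.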

Thanks & Regards

Brahma Reddy Battula




________________________________
From: Jagannath Naidu [jagannath.naidu@fosteringlinux.com]
Sent: Wednesday, November 05, 2014 5:11 PM
To: user@hadoop.apache.org
Subject: Re: issue about pig can not know HDFS HA configuration



On 5 November 2014 14:49, ch huang <justlooks@gmail.com> wrote:
hi, mailing list:
   I set up NameNode HA in my HDFS cluster, but it seems Pig does not know about it. Why?

2014-11-05 14:34:54,710 [JobControl] INFO  org.apache.hadoop.mapreduce.JobSubmitter - Cleaning
up the staging area file:/tmp/hadoop-root/mapred/staging/root1861403840/.staging/job_local1861403840_0001
2014-11-05 14:34:54,716 [JobControl] WARN  org.apache.hadoop.security.UserGroupInformation
- PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.pig.backend.executionengine.ExecException:
ERROR 2118: java.net.UnknownHostException: develop

Unknown host exception: this can be the issue. Check that the host is discoverable, either
from DNS or from the hosts file (see the sketch below).
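
If "develop" were a plain single-NameNode hostname rather than an HA nameservice, one way to make
it discoverable "from hosts", as suggested above, is an entry in /etc/hosts on the Pig client; the
IP address below is purely illustrative:

  # /etc/hosts on the Pig client (sketch; 192.0.2.10 is an assumed address)
  192.0.2.10   develop

In this thread, though, "develop" appears to be the HA nameservice, so the client-side HA
configuration described in the reply above is the more likely fix.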

2014-11-05 14:34:54,717 [JobControl] INFO  org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob
- PigLatin:DefaultJobName got an error while submitting
org.apache.pig.backend.executionengine.ExecException: ERROR 2118: java.net.UnknownHostException:
develop
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
        at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
        at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
        at java.lang.Thread.run(Thread.java:744)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: develop
        at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
        at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:237)
        at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:141)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:576)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:521)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
        at org.apache.hcatalog.mapreduce.HCatBaseInputFormat.setInputPath(HCatBaseInputFormat.java:326)
        at org.apache.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:127)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
        ... 18 more
Caused by: java.net.UnknownHostException: develop
        ... 33 more




--

Jaggu Naidu
