hadoop-hdfs-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Hadoop - impersonation doubts/issues while accessing from remote machine
Date Fri, 23 Aug 2013 10:21:43 GMT
I've answered this on the stackoverflow link:
http://stackoverflow.com/questions/18354664/spring-data-hadoop-connectivity

On Thu, Aug 22, 2013 at 1:29 PM, Omkar Joshi
<Omkar.Joshi@lntinfotech.com> wrote:
> For readability, I haven’t posted the code, output, etc. in this mail –
> please check the thread below:
>
> http://stackoverflow.com/questions/18354664/spring-data-hadoop-connectivity
>
> I'm trying to connect to a remote Hadoop (1.1.2) cluster from my local
> Windows machine via Spring Data (later, an Eclipse plug-in may also be
> used). In future, multiple such connections from several Windows machines
> are expected.
>
> On my remote (single-node) cluster, bigdata is the user for Hadoop etc.:
>
> bigdata@cloudx-843-770:~$ groups bigdata
> bigdata : bigdata
>
> On my local Windows machine:
>
> D:\>echo %username%
> 298790
>
> D:\>hostname
> INFVA03351
>
> Now, referring to Hadoop Secure Impersonation, does it mean I need to
> create a user 298790 on the cluster, add the hostname in core-site.xml,
> etc.? Is there a less cumbersome way? I tried that on the cluster too, but
> the error (partial output given below) still persists:
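For context, the proxy-user setup the question refers to is normally expressed as hadoop.proxyuser.* properties in core-site.xml on the cluster side. A minimal sketch, assuming the goal is to let the Windows user 298790 impersonate bigdata (which is in group bigdata) from host INFVA03351 — adjust names to your environment:

```xml
<!-- core-site.xml on the cluster (NameNode/JobTracker) — a sketch, not the
     poster's actual configuration -->
<property>
  <!-- hosts from which user 298790 may submit impersonated requests -->
  <name>hadoop.proxyuser.298790.hosts</name>
  <value>INFVA03351</value>
</property>
<property>
  <!-- groups whose members user 298790 is allowed to impersonate -->
  <name>hadoop.proxyuser.298790.groups</name>
  <value>bigdata</value>
</property>
```

After editing, the daemons need to pick up the change — either by restarting them or, on Hadoop 1.x, via `hadoop dfsadmin -refreshSuperUserGroupsConfiguration` (and `hadoop mradmin -refreshSuperUserGroupsConfiguration` for the JobTracker).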
>
>
>
> Aug 22, 2013 12:29:20 PM org.springframework.context.support.AbstractApplicationContext prepareRefresh
> INFO: Refreshing org.springframework.context.support.ClassPathXmlApplicationContext@1815338: startup date [Thu Aug 22 12:29:20 IST 2013]; root of context hierarchy
> Aug 22, 2013 12:29:20 PM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
> INFO: Loading XML bean definitions from class path resource [com/hadoop/basics/applicationContext.xml]
> Aug 22, 2013 12:29:20 PM org.springframework.core.io.support.PropertiesLoaderSupport loadProperties
> INFO: Loading properties file from class path resource [resources/hadoop.properties]
> Aug 22, 2013 12:29:20 PM org.springframework.beans.factory.support.DefaultListableBeanFactory preInstantiateSingletons
> INFO: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@7c197e: defining beans [org.springframework.context.support.PropertySourcesPlaceholderConfigurer#0,hadoopConfiguration,wc-job,myjobs-runner,resourceLoader]; root of factory hierarchy
> Aug 22, 2013 12:29:21 PM org.springframework.data.hadoop.mapreduce.JobExecutor$2 run
> INFO: Starting job [wc-job]
> Aug 22, 2013 12:29:21 PM org.apache.hadoop.security.UserGroupInformation doAs
> SEVERE: PriviledgedActionException as:bigdata via 298790 cause:org.apache.hadoop.ipc.RemoteException: User: 298790 is not allowed to impersonate bigdata
> Aug 22, 2013 12:29:21 PM org.springframework.data.hadoop.mapreduce.JobExecutor$2 run
> WARNING: Cannot start job [wc-job]
> org.apache.hadoop.ipc.RemoteException: User: 298790 is not allowed to impersonate bigdata
>       at org.apache.hadoop.ipc.Client.call(Client.java:1107)
>       at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>       at org.apache.hadoop.mapred.$Proxy2.getProtocolVersion(Unknown Source)
>       at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:411)
>       at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:499)
>       at org.apache.hadoop.mapred.JobClient.init(JobClient.java:490)
>       at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:473)
>       at org.apache.hadoop.mapreduce.Job$1.run(Job.java:513)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Unknown Source)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>       at org.apache.hadoop.mapreduce.Job.connect(Job.java:511)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:499)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>       at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:197)
>       at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:49)
>       at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:168)
>       at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:160)
>       at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:52)
>       at org.springframework.data.hadoop.mapreduce.JobRunner.afterPropertiesSet(JobRunner.java:44)
>       at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1541)
>       at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1479)
>       at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521)
>       at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:458)
>       at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:295)
>       at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)
>       at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:292)
>       at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
>       at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:628)
>       at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932)
>       at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479)
>       at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:197)
>       at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:172)
>       at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:158)
>       at com.hadoop.basics.WordCounter.main(WordCounter.java:58)
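The RemoteException in the trace above is the cluster's proxy-user check rejecting user 298790. On a non-secure (simple-auth) cluster, an alternative to configuring impersonation — offered here as an assumption, not something confirmed in this thread, and worth verifying against Hadoop 1.1.2 — is to have the client identify itself as bigdata directly via the HADOOP_USER_NAME environment variable:

```shell
# Make the Hadoop client libraries report "bigdata" as the acting user.
# Only honoured under simple authentication (no Kerberos).
export HADOOP_USER_NAME=bigdata

# On Windows (the poster's environment) the equivalent would be:
#   set HADOOP_USER_NAME=bigdata
```

With this set in the client process's environment, job submissions run as bigdata and no doAs/impersonation is needed at all.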
>
> Regards,
> Omkar Joshi



-- 
Harsh J
