From: Omkar Joshi
To: user@hadoop.apache.org
Date: Fri, 23 Aug 2013 15:54:38 +0530
Subject: RE: Hadoop - impersonation doubts/issues while accessing from remote machine

Thanks :)

Regards,
Omkar Joshi

-----Original Message-----
From: Harsh J [mailto:harsh@cloudera.com]
Sent: Friday, August 23, 2013 3:52 PM
Subject: Re: Hadoop - impersonation doubts/issues while accessing from remote machine

I've answered this on the stackoverflow link:
http://stackoverflow.com/questions/18354664/spring-data-hadoop-connectivity
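For background on the "as:bigdata via 298790" identity that shows up in the error quoted below: that string comes from Hadoop's proxy-user (impersonation) API, which in plain Hadoop 1.x client code looks roughly like the sketch below. This is presumably the same path the Spring Data Hadoop wiring in the question drives internally; the user and host names are taken from this thread, while the class name and the port 9000 are only illustrative assumptions.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class ProxyUserSketch {

    public static void main(String[] args) throws Exception {
        // Real user: whoever is logged in locally, e.g. 298790 on the Windows box.
        UserGroupInformation realUser = UserGroupInformation.getCurrentUser();

        // Effective user the remote calls should run as, e.g. bigdata on the cluster.
        UserGroupInformation proxyUgi =
                UserGroupInformation.createProxyUser("bigdata", realUser);

        final Configuration conf = new Configuration();
        // Hostname from the thread; the port is an assumption.
        conf.set("fs.default.name", "hdfs://cloudx-843-770:9000");

        // Everything inside doAs() goes over the wire as "bigdata via 298790".
        // The NameNode/JobTracker only accept that if core-site.xml on the
        // cluster whitelists 298790 as a proxy user for bigdata's group.
        proxyUgi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                FileSystem fs = FileSystem.get(conf);
                System.out.println("/ exists: " + fs.exists(new Path("/")));
                return null;
            }
        });
    }
}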
On Thu, Aug 22, 2013 at 1:29 PM, Omkar Joshi wrote:
> For readability, I haven't posted the code, output, etc. in this mail -
> please check the thread below:
>
> http://stackoverflow.com/questions/18354664/spring-data-hadoop-connectivity
>
> I'm trying to connect to a remote Hadoop (1.1.2) cluster from my local
> Windows machine via Spring Data (later, an Eclipse plug-in may also be
> used). In future, multiple such connections from several Windows machines
> are expected.
>
> On my remote (single-node) cluster, bigdata is the user for Hadoop etc.:
>
> bigdata@cloudx-843-770:~$ groups bigdata
> bigdata : bigdata
>
> On my local Windows machine:
>
> D:\>echo %username%
> 298790
>
> D:\>hostname
> INFVA03351
>
> Now if I refer to Hadoop Secure Impersonation, does it mean I need to
> create a user 298790 on the cluster, add the hostname in core-site.xml,
> etc.? Is there a less cumbersome way? I tried that on the cluster as well,
> but the error (partial output quoted further below) still persists.
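The cluster-side counterpart of that question is the proxy-user whitelist described in Hadoop's Secure Impersonation documentation: entries roughly like the sketch below in core-site.xml on the NameNode/JobTracker host, inside the <configuration> element, followed by a daemon restart (or a proxy-user configuration refresh). The property names are the documented ones; the values are only illustrative, built from the user, group and host names quoted above.

<property>
  <name>hadoop.proxyuser.298790.hosts</name>
  <!-- Host(s) the proxying user may connect from; the Windows box above, or * while testing. -->
  <value>INFVA03351</value>
</property>
<property>
  <name>hadoop.proxyuser.298790.groups</name>
  <!-- Groups whose members 298790 may impersonate; "groups bigdata" above reports "bigdata". -->
  <value>bigdata</value>
</property>

With entries like these in place, the "User: 298790 is not allowed to impersonate bigdata" rejection quoted below should no longer be raised, assuming the client really connects from the listed host.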
> Aug 22, 2013 12:29:20 PM org.springframework.context.support.AbstractApplicationContext prepareRefresh
> INFO: Refreshing org.springframework.context.support.ClassPathXmlApplicationContext@1815338: startup date [Thu Aug 22 12:29:20 IST 2013]; root of context hierarchy
> Aug 22, 2013 12:29:20 PM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
> INFO: Loading XML bean definitions from class path resource [com/hadoop/basics/applicationContext.xml]
> Aug 22, 2013 12:29:20 PM org.springframework.core.io.support.PropertiesLoaderSupport loadProperties
> INFO: Loading properties file from class path resource [resources/hadoop.properties]
> Aug 22, 2013 12:29:20 PM org.springframework.beans.factory.support.DefaultListableBeanFactory preInstantiateSingletons
> INFO: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@7c197e: defining beans [org.springframework.context.support.PropertySourcesPlaceholderConfigurer#0,hadoopConfiguration,wc-job,myjobs-runner,resourceLoader]; root of factory hierarchy
> Aug 22, 2013 12:29:21 PM org.springframework.data.hadoop.mapreduce.JobExecutor$2 run
> INFO: Starting job [wc-job]
> Aug 22, 2013 12:29:21 PM org.apache.hadoop.security.UserGroupInformation doAs
> SEVERE: PriviledgedActionException as:bigdata via 298790 cause:org.apache.hadoop.ipc.RemoteException: User: 298790 is not allowed to impersonate bigdata
> Aug 22, 2013 12:29:21 PM org.springframework.data.hadoop.mapreduce.JobExecutor$2 run
> WARNING: Cannot start job [wc-job]
> org.apache.hadoop.ipc.RemoteException: User: 298790 is not allowed to impersonate bigdata
>     at org.apache.hadoop.ipc.Client.call(Client.java:1107)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>     at org.apache.hadoop.mapred.$Proxy2.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:411)
>     at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:499)
>     at org.apache.hadoop.mapred.JobClient.init(JobClient.java:490)
>     at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:473)
>     at org.apache.hadoop.mapreduce.Job$1.run(Job.java:513)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Unknown Source)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>     at org.apache.hadoop.mapreduce.Job.connect(Job.java:511)
>     at org.apache.hadoop.mapreduce.Job.submit(Job.java:499)
>     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>     at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:197)
>     at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:49)
>     at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:168)
>     at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:160)
>     at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:52)
>     at org.springframework.data.hadoop.mapreduce.JobRunner.afterPropertiesSet(JobRunner.java:44)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1541)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1479)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:458)
>     at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:295)
>     at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)
>     at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:292)
>     at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
>     at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:628)
>     at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932)
>     at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479)
>     at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:197)
>     at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:172)
>     at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:158)
>     at com.hadoop.basics.WordCounter.main(WordCounter.java:58)
>
> Regards,
>
> Omkar Joshi

--
Harsh J