Mailing-List: user@hadoop.apache.org (run by ezmlm)
From: Harsh J
Date: Fri, 23 Aug 2013 15:51:43 +0530
Subject: Re: Hadoop - impersonation doubts/issues while accessing from remote machine
To: user@hadoop.apache.org

I've answered this on the stackoverflow link:
http://stackoverflow.com/questions/18354664/spring-data-hadoop-connectivity

On Thu, Aug 22, 2013 at 1:29 PM, Omkar Joshi wrote:
> For readability, I haven't posted the code, output, etc. in this mail –
> please check the thread below:
>
> http://stackoverflow.com/questions/18354664/spring-data-hadoop-connectivity
>
> I'm trying to connect to a remote Hadoop (1.1.2) cluster from my local
> Windows machine via Spring Data (later, an Eclipse plug-in may also be
> used). In future, multiple such connections from several Windows machines
> are expected.
>
> On my remote (single-node) cluster, "bigdata" is the user that runs Hadoop:
>
> bigdata@cloudx-843-770:~$ groups bigdata
> bigdata : bigdata
>
> On my local Windows machine:
>
> D:\>echo %username%
> 298790
>
> D:\>hostname
> INFVA03351
>
> Now, if I refer to Hadoop Secure Impersonation, does it mean I need to
> create a user 298790 on the cluster, add the hostname in core-site.xml,
> etc.? Are there any less-cumbersome ways out?
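For context, server-side impersonation in Hadoop 1.x is controlled by the hadoop.proxyuser.* properties in core-site.xml on the cluster. The error later in this thread ("User: 298790 is not allowed to impersonate bigdata") means the authenticated user 298790 lacks such an entry. A minimal sketch, not taken from the original thread; the host and group values are assumptions drawn from the question and must be adjusted to the actual environment:

```xml
<!-- core-site.xml on the cluster (NameNode/JobTracker side).
     Hedged example: allows the remote user "298790" to impersonate
     members of the "bigdata" group when connecting from the host
     INFVA03351. Host and group values here are assumptions. -->
<property>
  <name>hadoop.proxyuser.298790.hosts</name>
  <value>INFVA03351</value>
</property>
<property>
  <name>hadoop.proxyuser.298790.groups</name>
  <value>bigdata</value>
</property>
```

The daemons generally need a restart after editing this file (or, on releases that support it, `hadoop dfsadmin -refreshSuperUserGroupsConfiguration`) for the proxy-user settings to take effect.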
> I tried that too on the cluster, but the (partially shown) error output
> still persists:
>
> Aug 22, 2013 12:29:20 PM
> org.springframework.context.support.AbstractApplicationContext prepareRefresh
> INFO: Refreshing
> org.springframework.context.support.ClassPathXmlApplicationContext@1815338:
> startup date [Thu Aug 22 12:29:20 IST 2013]; root of context hierarchy
>
> Aug 22, 2013 12:29:20 PM
> org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
> INFO: Loading XML bean definitions from class path resource
> [com/hadoop/basics/applicationContext.xml]
>
> Aug 22, 2013 12:29:20 PM
> org.springframework.core.io.support.PropertiesLoaderSupport loadProperties
> INFO: Loading properties file from class path resource
> [resources/hadoop.properties]
>
> Aug 22, 2013 12:29:20 PM
> org.springframework.beans.factory.support.DefaultListableBeanFactory
> preInstantiateSingletons
> INFO: Pre-instantiating singletons in
> org.springframework.beans.factory.support.DefaultListableBeanFactory@7c197e:
> defining beans
> [org.springframework.context.support.PropertySourcesPlaceholderConfigurer#0,hadoopConfiguration,wc-job,myjobs-runner,resourceLoader];
> root of factory hierarchy
>
> Aug 22, 2013 12:29:21 PM
> org.springframework.data.hadoop.mapreduce.JobExecutor$2 run
> INFO: Starting job [wc-job]
>
> Aug 22, 2013 12:29:21 PM org.apache.hadoop.security.UserGroupInformation doAs
> SEVERE: PriviledgedActionException as:bigdata via 298790
> cause:org.apache.hadoop.ipc.RemoteException: User: 298790 is not allowed to
> impersonate bigdata
>
> Aug 22, 2013 12:29:21 PM
> org.springframework.data.hadoop.mapreduce.JobExecutor$2 run
> WARNING: Cannot start job [wc-job]
>
> org.apache.hadoop.ipc.RemoteException: User: 298790 is not allowed to
> impersonate bigdata
>     at org.apache.hadoop.ipc.Client.call(Client.java:1107)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>     at
> org.apache.hadoop.mapred.$Proxy2.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:411)
>     at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:499)
>     at org.apache.hadoop.mapred.JobClient.init(JobClient.java:490)
>     at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:473)
>     at org.apache.hadoop.mapreduce.Job$1.run(Job.java:513)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Unknown Source)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>     at org.apache.hadoop.mapreduce.Job.connect(Job.java:511)
>     at org.apache.hadoop.mapreduce.Job.submit(Job.java:499)
>     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
>     at org.springframework.data.hadoop.mapreduce.JobExecutor$2.run(JobExecutor.java:197)
>     at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:49)
>     at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:168)
>     at org.springframework.data.hadoop.mapreduce.JobExecutor.startJobs(JobExecutor.java:160)
>     at org.springframework.data.hadoop.mapreduce.JobRunner.call(JobRunner.java:52)
>     at org.springframework.data.hadoop.mapreduce.JobRunner.afterPropertiesSet(JobRunner.java:44)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1541)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1479)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:521)
>     at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:458)
>     at
> org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:295)
>     at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)
>     at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:292)
>     at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
>     at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:628)
>     at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:932)
>     at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:479)
>     at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:197)
>     at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:172)
>     at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:158)
>     at com.hadoop.basics.WordCounter.main(WordCounter.java:58)
>
> Regards,
> Omkar Joshi

--
Harsh J