Subject: Re: "SIMPLE authentication is not enabled" error for secured hdfs read
From: Chris Nauroth <cnauroth@hortonworks.com>
To: yarn-dev@hadoop.apache.org
Cc: hdfs-dev@hadoop.apache.org, hdfs-issues@hadoop.apache.org, yarn-issues@hadoop.apache.org, mapreduce-dev@hadoop.apache.org
Date: Tue, 24 Jun 2014 11:24:16 -0700

Hi David,

UserGroupInformation.createRemoteUser does not attach credentials to the
returned ugi, so I expect the server side is rejecting the connection due
to lack of credentials. This is actually by design: createRemoteUser is
primarily intended for use on the server side, when the server wants to
run a piece of its own code while impersonating the client.
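To illustrate, here is roughly the pattern createRemoteUser is meant for
inside a server process, where the RPC layer has already authenticated the
caller. This is just a sketch; the class name and the hard-coded user are
placeholders:

    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ImpersonationSketch {
      public static void main(String[] args) throws Exception {
        // Placeholder: in a real server the name comes from the
        // already-authenticated RPC connection, not a hard-coded string.
        final String remoteUserName = "alice";
        UserGroupInformation callerUgi =
            UserGroupInformation.createRemoteUser(remoteUserName);
        String effectiveUser = callerUgi.doAs(
            new PrivilegedExceptionAction<String>() {
              public String run() throws Exception {
                // Inside doAs, getCurrentUser() reports the impersonated
                // caller, so authorization checks see that identity.
                return UserGroupInformation.getCurrentUser().getShortUserName();
              }
            });
        System.out.println("Running as: " + effectiveUser);
      }
    }

Notice there are no credentials anywhere in that flow, which is exactly
why it fails when a client uses it to open a new, authenticated connection.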
I'd say that your second code sample is the correct one. After running
kinit to get credentials, you can just run your code. I expect Kerberos
authentication to work without taking any special measures, such as
calling UserGroupInformation directly from your code.
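If you ever do need to log in programmatically, for example in a
long-running service where running kinit out of band isn't practical, a
keytab login is the usual alternative. A rough sketch; the principal and
keytab path below are made up, so substitute your own:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KeytabLoginSketch {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // On a secured cluster this is normally already set in
        // core-site.xml; shown explicitly here for completeness.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Placeholder principal and keytab path.
        UserGroupInformation.loginUserFromKeytab(
            "hadoop@EXAMPLE.COM", "/etc/security/keytabs/hadoop.keytab");
        // After this call, FileSystem.get(conf) authenticates as the
        // logged-in principal; no doAs wrapper is needed.
      }
    }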
Hope this helps.

Chris Nauroth
Hortonworks
http://hortonworks.com/

On Tue, Jun 24, 2014 at 6:29 AM, Liu, David wrote:

> Hi experts,
>
> After running kinit as hadoop, when I run this Java program on a secured
> Hadoop cluster, I get the following error:
>
> 14/06/24 16:53:41 ERROR security.UserGroupInformation:
> PriviledgedActionException as:hdfs (auth:SIMPLE)
> cause:org.apache.hadoop.security.AccessControlException: Client cannot
> authenticate via:[TOKEN, KERBEROS]
> 14/06/24 16:53:41 WARN ipc.Client: Exception encountered while connecting
> to the server : org.apache.hadoop.security.AccessControlException: Client
> cannot authenticate via:[TOKEN, KERBEROS]
> 14/06/24 16:53:41 ERROR security.UserGroupInformation:
> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
> org.apache.hadoop.security.AccessControlException: Client cannot
> authenticate via:[TOKEN, KERBEROS]
> 14/06/24 16:53:41 ERROR security.UserGroupInformation:
> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
> Failed on local exception: java.io.IOException:
> org.apache.hadoop.security.AccessControlException: Client cannot
> authenticate via:[TOKEN, KERBEROS]; Host Details : local host is:
> "hdsh2-a161/10.62.66.161"; destination host is: "hdsh2-a161.lss.emc.com":8020;
> Exception in thread "main" java.io.IOException: Failed on local exception:
> java.io.IOException: org.apache.hadoop.security.AccessControlException:
> Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host
> is: "hdsh2-a161/10.62.66.161"; destination host is:
> "hdsh2-a161.lss.emc.com":8020;
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1351)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>     at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>     at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:191)
>     at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1067)
>     at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1057)
>     at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1047)
>     at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:235)
>     at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:202)
>     at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:195)
>     at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1215)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:290)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:286)
>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:286)
>     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:763)
>     at Testhdfs$1.run(Testhdfs.java:43)
>     at Testhdfs$1.run(Testhdfs.java:30)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at Testhdfs.main(Testhdfs.java:30)
>
> Here is my code:
>
>     UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hadoop");
>     ugi.doAs(new PrivilegedExceptionAction<Void>() {
>         public Void run() throws Exception {
>             Configuration conf = new Configuration();
>             FileSystem fs = FileSystem.get(URI.create(uri), conf);
>             FSDataInputStream in = fs.open(new Path(uri));
>             IOUtils.copy(in, System.out, 4096);
>             return null;
>         }
>     });
>
> But when I run it without UserGroupInformation, like this, on the same
> cluster with the same user, the code works fine:
>
>     Configuration conf = new Configuration();
>     FileSystem fs = FileSystem.get(URI.create(uri), conf);
>     FSDataInputStream in = fs.open(new Path(uri));
>     IOUtils.copy(in, System.out, 4096);
>
> Could anyone help me?
>
> Thanks