Subject: Re: "Server not found in Kerberos database" for MiniKDC server
From: Elazar Leibovich <elazarl@gmail.com>
To: user@hadoop.apache.org
Date: Mon, 20 Jan 2014 13:34:13 +0200

For the sake of completeness: the same settings worked on a Linux box.

On Wed, Jan 15, 2014 at 10:57 PM, Elazar Leibovich <elazarl@gmail.com> wrote:
> Hi,
>
> For educational purposes, I'm trying to set up a minimal working secure
> Hadoop cluster on my machine.
>
> What I basically did is:
>
> Add example.com to /etc/hosts.
>
> Set up a MiniKDC server. It'll generate krb5.conf and a keytab, and create
> some users: {nn,dn,hdfs}@EXAMPLE.COM.
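>
> (Roughly, that MiniKDC step looks like the sketch below. This is a minimal
> reconstruction rather than my exact code; it assumes the hadoop-minikdc
> test artifact is on the classpath, and the work directory path is
> illustrative.)
>
>     import java.io.File;
>     import org.apache.hadoop.minikdc.MiniKdc;
>
>     // Inside a method declared "throws Exception":
>     File workDir = new File("/tmp/minikdc");  // illustrative path
>     MiniKdc kdc = new MiniKdc(MiniKdc.createConf(), workDir);
>     kdc.start();  // embedded KDC; the default realm is EXAMPLE.COM
>
>     // Export all the principals into a single keytab file.
>     File keytab = new File(workDir, "keytab");
>     kdc.createPrincipal(keytab,
>             "nn/EXAMPLE.COM", "dn/EXAMPLE.COM", "hdfs/EXAMPLE.COM");
>
>     // MiniKdc also writes out a krb5.conf pointing at itself.
>     System.out.println("krb5.conf: " + kdc.getKrb5conf());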
>
> Point Java at krb5.conf with HADOOP_OPTS, as well as a required workaround
> for Mac OS X Java:
>
> ❯ ~/hadoopconf env HADOOP_OPTS
> hadoop-env.sh HADOOP_OPTS = -Djava.awt.headless=true
> -Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> -Djava.net.preferIPv4Stack=true
>
> Set the proper Hadoop configuration using the keytab and the Hadoop users:
>
> ❯ ~/hadoopconf get --local
> hdfs-site.xml dfs.datanode.address            = example.com:1004
> core-site.xml fs.defaultFS                    = hdfs://example.com
> hdfs-site.xml dfs.namenode.keytab.file        = /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.datanode.hostname           = example.com
> hdfs-site.xml dfs.datanode.kerberos.principal = dn/EXAMPLE.COM@EXAMPLE.COM
> hdfs-site.xml dfs.datanode.data.dir           = /tmp/hadoop-eleibovi/dfs/data
> hdfs-site.xml dfs.datanode.keytab.file        = /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.namenode.kerberos.principal = nn/EXAMPLE.COM@EXAMPLE.COM
> core-site.xml hadoop.security.authorization   = true
> core-site.xml hadoop.security.authentication  = kerberos
> hdfs-site.xml dfs.datanode.dns.interface      = lo0
> hdfs-site.xml dfs.datanode.http.address       = example.com:1006
>
> Start the namenode service:
>
> $ ./bin/hdfs
> ...
> 14/01/15 19:22:43 INFO ipc.Server: IPC Server listener on 8020: starting
> 14/01/15 19:22:43 INFO namenode.NameNode: NameNode RPC up at: localhost/127.0.0.1:8020
> 14/01/15 19:22:43 INFO namenode.FSNamesystem: Starting services required for active state
>
> Finally, use the following short Java program to contact the namenode:
>
> System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
> UserGroupInformation.setConfiguration(conf);
> UserGroupInformation ugi = UserGroupInformation
>         .loginUserFromKeytabAndReturnUGI("hdfs/EXAMPLE.COM", cwd + "/keytab");
> ugi.doAs(new PrivilegedExceptionAction<Object>() {
>     @Override
>     public Object run() throws Exception {
>         final FileSystem fs = FileSystem.get(conf);
>         fs.getFileStatus(new Path("/"));
>         return null;
>     }
> });
>
> The exception I got is:
>
> Exception in thread "main" java.io.IOException: Failed on local exception:
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
> [Caused by GSSException: No valid credentials provided (Mechanism level:
> Server not found in Kerberos database (7) - Server not found in Kerberos
> database)]; Host Details : local host is: "tlv-mpbxb/127.0.0.1";
> destination host is: "example.com":8020;
>
> I'd be glad for any help debugging the problem.
>
> Thanks,
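>
> (For context: the Kerberos portion of the trace below is the JDK's own
> debug output; as far as I know it's enabled with the standard JVM flag,
> along the lines of
>
>     -Dsun.security.krb5.debug=true
>
> added to HADOOP_OPTS / the client's JVM arguments.)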
>
> I attach a full log with Kerberos debug turned on:
>
> args: [-conf, /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml, -conf, /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml]
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and group related metrics
> Java config name: /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> Loaded from Java config
> 2014-01-15 19:29:46 DEBUG Groups:180 - Creating new Groups object
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:46 - Trying to load the custom-built native-hadoop library...
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:55 - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
> 2014-01-15 19:29:46 DEBUG NativeCodeLoader:56 - java.library.path=/Users/eleibovi/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
> 2014-01-15 19:29:46 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:40 - Falling back to shell based
> 2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
> 2014-01-15 19:29:46 DEBUG Groups:66 - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
> Java config name: /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> Loaded from Java config
> >>> KdcAccessibility: reset
> >>> KdcAccessibility: reset
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 53; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 69; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): nn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 53; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 69; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): dn
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 61; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 59; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 75; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 67; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): eleibovi
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 67; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 55; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 71; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 63; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTab: load() entry length: 63; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 42; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 58; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 50; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): hdfs
> >>> KeyTab: load() entry length: 50; type: 23
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 42; type: 3
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 58; type: 16
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 50; type: 17
> >>> KeyTabInputStream, readName(): EXAMPLE.COM
> >>> KeyTabInputStream, readName(): root
> >>> KeyTab: load() entry length: 50; type: 23
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> >>> KrbAsReq creating message
> >>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of retries =3, #bytes=158
> >>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1, #bytes=158
> >>>DEBUG: TCPClient reading 529 bytes
> >>> KrbKdcReq send: #bytes read=529
> >>> KdcAccessibility: remove localhost:50064
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> >>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
> >>> KrbAsRep cons in KrbAsReq.getReply hdfs/EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:176 - hadoop login
> Added key: 23version: 0
> Added key: 17version: 0
> Added key: 16version: 0
> Added key: 3version: 0
> Ordering keys wrt default_tkt_enctypes list
> Using builtin default etypes for default_tkt_enctypes
> default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:125 - hadoop login commit
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:139 - using kerberos user:hdfs/EXAMPLE.COM@EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS) from:com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
> >>KERBEROS
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:326 - dfs.client.use.legacy.blockreader.local = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:329 - dfs.client.read.shortcircuit = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:332 - dfs.client.domain.socket.data.traffic = false
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:335 - dfs.domain.socket.path =
> 2014-01-15 19:29:47 DEBUG MetricsSystemImpl:220 - StartupProgress, NameNode startup progress
> 2014-01-15 19:29:47 DEBUG RetryUtils:74 - multipleLinearRandomRetry = null
> 2014-01-15 19:29:47 DEBUG Server:220 - rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@379b0d9c
> 2014-01-15 19:29:47 DEBUG BlockReaderLocal:63 - Both short-circuit local reads and UNIX domain socket are disabled.
> 2014-01-15 19:29:47 DEBUG Shell:237 - Failed to detect a valid hadoop home directory
> java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
>     at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:219)
>     at org.apache.hadoop.util.Shell.<clinit>(Shell.java:244)
>     at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
>     at org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1539)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:492)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:445)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2429)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2463)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2445)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:363)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:165)
>     at com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:38)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
>     at com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>     at com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
> 2014-01-15 19:29:47 DEBUG Shell:316 - setsid is not available on this machine. So not using it.
> 2014-01-15 19:29:47 DEBUG Shell:320 - setsid exited with exit code 0
> 2014-01-15 19:29:47 DEBUG Client:371 - The ping interval is 60000 ms.
> 2014-01-15 19:29:47 DEBUG Client:636 - Connecting to example.com/127.0.0.1:8020
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:438 - Sending sasl message state: NEGOTIATE
>
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:370 - Received SASL message state: NEGOTIATE
> auths {
>   method: "TOKEN"
>   mechanism: "DIGEST-MD5"
>   protocol: ""
>   serverId: "default"
>   challenge: "realm=\"default\",nonce=\"Evyt9cWyZFDUbCtQYNcgF5FY7rsBqxVCgtggY48n\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
> }
> auths {
>   method: "KERBEROS"
>   mechanism: "GSSAPI"
>   protocol: "nn"
>   serverId: "EXAMPLE.COM"
> }
>
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:259 - Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:287 - Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal)
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:231 - RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is nn/EXAMPLE.COM@EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:242 - Creating SASL GSSAPI(KERBEROS) client to authenticate to service at EXAMPLE.COM
> 2014-01-15 19:29:47 DEBUG SaslRpcClient:172 - Use KERBEROS authentication for protocol ClientNamenodeProtocolPB
> Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
> Entered Krb5Context.initSecContext with state=STATE_NEW
> Found ticket for hdfs/EXAMPLE.COM@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Jan 16 19:29:46 IST 2014
> Service ticket not found in the subject
> >>> Credentials acquireServiceCreds: same realm
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
> >>> KrbKdcReq send: kdc=localhost TCP:50064, timeout=30000, number of retries =3, #bytes=583
> >>> KDCCommunication: kdc=localhost TCP:50064, timeout=30000,Attempt =1, #bytes=583
> >>>DEBUG: TCPClient reading 135 bytes
> >>> KrbKdcReq send: #bytes read=135
> >>> KdcAccessibility: remove localhost:50064
> >>> KDCRep: init() encoding tag is 126 req type is 13
> >>>KRBError:
> sTime is Wed Jan 15 19:29:47 IST 2014 1389806987000
> suSec is 0
> error code is 7
> error Message is Server not found in Kerberos database
> realm is EXAMPLE.COM
> sname is krbtgt/EXAMPLE.COM
> msgType is 30
> 2014-01-15 19:29:47 ERROR UserGroupInformation:1480 - PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)]
> 2014-01-15 19:29:47 DEBUG UserGroupInformation:1499 - PrivilegedAction as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
> >>>KinitOptions cache name is /Users/eleibovi/krb5cc_eleibovi
> >> Acquire default native Credentials
> >>> Obtained TGT from LSA: Credentials:
> *...Here it looks like it tries to get the machine's Kerberos tickets...*
> client=eleibovi@[snipped computer Kerberos server from /etc/krb5.conf]
> server=krbtgt/[snipped computer Kerberos server from /etc/krb5.conf]
> authTime=20140109035156Z
> startTime=20140109115308Z
> endTime=20140109215308Z
> renewTill=20140116035156Z
> flags: FORWARDABLE;RENEWABLE;INITIAL;PRE-AUTHENT
> EType (int): 18
> Using builtin default etypes for default_tgs_enctypes
> default etypes for default_tgs_enctypes: 18 17 16 23 1 3.
> >>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
> >>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
> getKDCFromDNS using UDP
> getKDCFromDNS using TCP
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:176 - hadoop login
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:125 - hadoop login commit
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:139 - using kerberos user:null
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:155 - using local user:UnixPrincipal: eleibovi
> 2014-01-15 19:30:18 DEBUG UserGroupInformation:696 - UGI loginUser:eleibovi (auth:KERBEROS)
> 2014-01-15 19:30:18 WARN Client:615 - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)]
> 2014-01-15 19:30:18 ERROR UserGroupInformation:1480 - PriviledgedActionException as:hdfs/EXAMPLE.COM@EXAMPLE.COM (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)]
> 2014-01-15 19:30:18 DEBUG Client:1099 - closing ipc connection to example.com/127.0.0.1:8020: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)]
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)]
>     at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:620)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
>     at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:667)
>     at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1318)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>     at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:188)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>     at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
>     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1636)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1117)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1113)
>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:78)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1113)
>     at com.github.elazar.hadoop.examples.HadoopBasicUsage$1.run(HadoopBasicUsage.java:40)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
>     at com.github.elazar.hadoop.examples.HadoopBasicUsage.run(HadoopBasicUsage.java:34)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>     at com.github.elazar.hadoop.examples.HadoopBasicUsage.main(HadoopBasicUsage.java:18)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
> Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)]
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>     at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:394)
>     at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
>     at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
>     ... 32 more
> Caused by: GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - Server not found in Kerberos database)
>     at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:710)
>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
>     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
>     ... 41 more
> Caused by: KrbException: Server not found in Kerberos database (7) - Server not found in Kerberos database
>     at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
>     at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:192)
>     at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:203)
>     at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:311)
>     at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
>     at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:449)
>     at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
>     ... 44 more
> Caused by: KrbException: Identifier doesn't match expected value (906)
>     at sun.security.krb5.internal.KDCRep.init(KDCRep.java:143)
>     at sun.security.krb5.internal.TGSRep.init(TGSRep.java:66)
>     at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:61)
>     at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
>     ... 50 more
> 2014-01-15 19:30:18 DEBUG Client:1107 - IPC Client (23781497) connection to example.com/127.0.0.1:8020 from hdfs/EXAMPLE.COM@EXAMPLE.COM: closed