From: Björn-Elmar Macek
Date: Fri, 27 Apr 2012 12:39:05 +0200
To: hdfs-user@hadoop.apache.org
Subject: Re: Hadoop Configuration Issues

Hi Alex,

as I wrote, I already did that! The problem is, as stated in my previous
mail, that all of the ${...} placeholder variables seem to be UNSET - not
only SECURITY_TYPE. Since I don't really understand these parameters, I
would like to use the defaults, which AFAIK should be configured in
hadoop-env.sh. But obviously they are not.

Best,
Björn

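For reference, a minimal sketch of the setting Alex suggests below, as it
would look in core-site.xml (an illustration only: the property name and
the value "simple" come from this thread, while the file name and the
surrounding <configuration> element are assumed from the standard Hadoop
config layout, not confirmed here):

  <configuration>
    <!-- plain authentication instead of the unexpanded ${SECURITY_TYPE} -->
    <property>
      <name>hadoop.security.authentication</name>
      <value>simple</value>
    </property>
  </configuration>
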
On 27.04.2012 12:12, alo alt wrote:
> Hi,
>
> Invalid attribute value for hadoop.security.authentication of ${SECURITY_TYPE}
> Set it to simple and it should work (default is kerberos).
>
> - Alex
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Apr 27, 2012, at 12:01 PM, Björn-Elmar Macek wrote:
>
>> Hello,
>>
>> I have recently installed Hadoop on my machine and on a second one, in
>> order to test the setup and develop small programs locally before
>> deploying them to the cluster. I stumbled over several difficulties,
>> which I could fix with some internet research. But once again I am
>> stuck, and I think I can narrow the problem down:
>>
>> When Hadoop evaluates the config files in /etc/hadoop, it does not seem
>> to have default values for any of the variables used within them:
>>
>> \________ First Error:
>> hadoop namenode -format
>> Warning: $HADOOP_HOME is deprecated.
>>
>> 12/04/27 11:31:41 INFO namenode.NameNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> STARTUP_MSG:   args = [-format]
>> STARTUP_MSG:   version = 1.0.1
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1243785; compiled by 'hortonfo' on Tue Feb 14 08:13:52 UTC 2012
>> ************************************************************/
>> 12/04/27 11:31:41 INFO util.GSet: VM type       = 32-bit
>> 12/04/27 11:31:41 INFO util.GSet: 2% max memory = 2.475 MB
>> 12/04/27 11:31:41 INFO util.GSet: capacity      = 2^19 = 524288 entries
>> 12/04/27 11:31:41 INFO util.GSet: recommended=524288, actual=524288
>> 12/04/27 11:31:41 ERROR namenode.NameNode: java.lang.IllegalArgumentException: Invalid attribute value for hadoop.security.authentication of ${SECURITY_TYPE}
>>     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:202)
>>     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
>>     at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
>>     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:438)
>>     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:424)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:473)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:462)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1162)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1271)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)
>>
>> 12/04/27 11:31:41 INFO namenode.NameNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> ************************************************************/
>>
>>
>> \_________ Solution
>> I removed the variable and replaced it with the value "simple".
>> Then the next error occurred:
>>
>>
>> \_________ Error 2
>> hadoop namenode -format
>> Warning: $HADOOP_HOME is deprecated.
>>
>> 12/04/27 11:46:33 INFO namenode.NameNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> STARTUP_MSG:   args = [-format]
>> STARTUP_MSG:   version = 1.0.1
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1243785; compiled by 'hortonfo' on Tue Feb 14 08:13:52 UTC 2012
>> ************************************************************/
>> 12/04/27 11:46:33 INFO util.GSet: VM type       = 32-bit
>> 12/04/27 11:46:33 INFO util.GSet: 2% max memory = 2.475 MB
>> 12/04/27 11:46:33 INFO util.GSet: capacity      = 2^19 = 524288 entries
>> 12/04/27 11:46:33 INFO util.GSet: recommended=524288, actual=524288
>> 12/04/27 11:46:33 ERROR namenode.NameNode: java.lang.ExceptionInInitializerError
>>     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:212)
>>     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
>>     at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
>>     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:438)
>>     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:424)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:473)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:462)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1162)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1271)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)
>> Caused by: java.util.regex.PatternSyntaxException: Illegal repetition near index 8
>> [jt]t@.*${KERBEROS_REALM}
>>         ^
>>     at java.util.regex.Pattern.error(Pattern.java:1730)
>>     at java.util.regex.Pattern.closure(Pattern.java:2792)
>>     at java.util.regex.Pattern.sequence(Pattern.java:1906)
>>     at java.util.regex.Pattern.expr(Pattern.java:1769)
>>     at java.util.regex.Pattern.compile(Pattern.java:1477)
>>     at java.util.regex.Pattern.<init>(Pattern.java:1150)
>>     at java.util.regex.Pattern.compile(Pattern.java:840)
>>     at org.apache.hadoop.security.KerberosName$Rule.<init>(KerberosName.java:188)
>>     at org.apache.hadoop.security.KerberosName.parseRules(KerberosName.java:324)
>>     at org.apache.hadoop.security.KerberosName.setConfiguration(KerberosName.java:343)
>>     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:212)
>>     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
>>     at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
>>     at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)
>>     ... 10 more
>>
>> 12/04/27 11:46:33 INFO namenode.NameNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> ************************************************************/
>>
>> \______ And once again...
>> ... a variable seems to be undefined. And I guess that once I find a
>> suitable value for this property, the next one will turn out to be
>> undefined as well. I hope there are default values available somewhere,
>> because I have no idea what to put into these slots. None of the books
>> and instructions I have read on Hadoop ever discussed these issues.
>>
>> BTW: HADOOP_HOME is defined, even though the log suggests otherwise.
>>
>> I hope you can assist me.
>>
>> Best regards,
>> Björn-Elmar Macek
>>
>
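
For the second error above: the PatternSyntaxException is thrown while
KerberosName.parseRules compiles a rule still containing the literal,
unexpanded ${KERBEROS_REALM}. Judging from the stack trace, those rules are
read from the hadoop.security.auth_to_local property; assuming that is
where the ${KERBEROS_REALM} template sits in these config files, a minimal
non-Kerberos sketch would simply fall back to the default mapping:

  <!-- sketch only: replaces the templated Kerberos translation rules -->
  <property>
    <name>hadoop.security.auth_to_local</name>
    <value>DEFAULT</value>
  </property>

(As for the "Warning: $HADOOP_HOME is deprecated." line: it is printed
precisely because HADOOP_HOME *is* set - it only flags the variable as
deprecated in this release, not as missing.)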