ambari-dev mailing list archives

From "Yusaku Sako (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-8172) Knox Gateway failed to start when Knox service account name was customized
Date Thu, 06 Nov 2014 01:22:33 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-8172?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14199557#comment-14199557 ]

Yusaku Sako commented on AMBARI-8172:
-------------------------------------

[~sumit.gupta]
Thanks for the patch. 
Instead of running chown directly, we should use the "Directory" resource.
This wrapper provides abstraction and idempotency (it creates the directory if it does not
already exist, and applies the requested owner, group, and mode), so it is our standard way of handling this.
Please take a look at https://issues.apache.org/jira/secure/attachment/12679709/AMBARI-8173.patch
as an example.
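For illustration, the create-if-missing, converge-attributes-on-every-run behavior that the "Directory" resource gives us (and that a bare chown does not) can be sketched in plain Python. This is a conceptual sketch only, not Ambari's actual implementation; ownership handling is omitted because chown requires root:

```python
import os
import tempfile

def ensure_directory(path, mode=0o755):
    """Idempotently ensure a directory exists with the given mode.

    Conceptually mirrors what the Directory resource does: create the
    directory only if it is missing, and converge its attributes on
    every run, so repeated calls are safe no-ops.
    """
    created = False
    if not os.path.isdir(path):
        os.makedirs(path, mode)
        created = True
    # Converge the mode even if the directory already existed.
    os.chmod(path, mode)
    return created

# Calling it twice is safe -- the second call creates nothing.
base = tempfile.mkdtemp()
target = os.path.join(base, "etc", "knox", "conf")
first = ensure_directory(target, 0o755)
second = ensure_directory(target, 0o755)
```

A raw `chown`/`mkdir` in the script would fail or misbehave if the directory were missing or already converged; the resource absorbs both cases.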



> Knox Gateway failed to start when Knox service account name was customized
> --------------------------------------------------------------------------
>
>                 Key: AMBARI-8172
>                 URL: https://issues.apache.org/jira/browse/AMBARI-8172
>             Project: Ambari
>          Issue Type: Bug
>          Components: stacks
>    Affects Versions: 1.7.0
>            Reporter: Sumit Gupta
>            Assignee: Sumit Gupta
>            Priority: Critical
>             Fix For: 1.7.0
>
>         Attachments: AMBARI-8172.patch, AMBARI-8172.patch.2
>
>
> When the Knox service account is changed during the install wizard, Knox fails to start.
> For example, this happens when the 'knox' user is changed to 'knox1'.
> {code}
> stderr: 
> 2014-11-05 01:22:19,777 - Error while executing command 'start':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/KNOX/package/scripts/knox_gateway.py", line 43, in start
>     self.configure(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/KNOX/package/scripts/knox_gateway.py", line 37, in configure
>     knox()
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/KNOX/package/scripts/knox.py", line 63, in knox
>     not_if=format('test -f {knox_master_secret_path}')
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
>     raise ex
> Fail: Execution of '/usr/hdp/current/knox-server/bin/knoxcli.sh create-master --master [PROTECTED]' returned 1. log4j:ERROR setFile(null,true) call failed.
> java.io.FileNotFoundException: /usr/hdp/2.2.0.0-1770/knox/bin/../logs/knoxcli.log (Permission denied)
> 	at java.io.FileOutputStream.open(Native Method)
> 	at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
> 	at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
> 	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
> 	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
> 	at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
> 	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
> 	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
> 	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
> 	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
> 	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
> 	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
> 	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
> 	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:395)
> 	at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:403)
> 	at org.apache.hadoop.gateway.util.KnoxCLI.main(KnoxCLI.java:648)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:70)
> 	at org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:39)
> 	at org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
> 	at org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:69)
> 	at org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:46)
> log4j:ERROR Either File or DatePattern options are not set for appender [drfa].
> This command requires write permissions on the security directory: /usr/hdp/2.2.0.0-1770/knox/bin/../data/security
>  stdout:
> 2014-11-05 01:22:17,433 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://yusaku1-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
> 2014-11-05 01:22:17,465 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://yusaku1-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
> 2014-11-05 01:22:17,466 - Group['nobody'] {'ignore_failures': False}
> 2014-11-05 01:22:17,467 - Modifying group nobody
> 2014-11-05 01:22:17,557 - Group['users1'] {'ignore_failures': False}
> 2014-11-05 01:22:17,557 - Modifying group users1
> 2014-11-05 01:22:17,587 - Group['nagios1'] {'ignore_failures': False}
> 2014-11-05 01:22:17,588 - Modifying group nagios1
> 2014-11-05 01:22:17,622 - Group['nobody1'] {'ignore_failures': False}
> 2014-11-05 01:22:17,622 - Modifying group nobody1
> 2014-11-05 01:22:17,667 - Group['hadoop1'] {'ignore_failures': False}
> 2014-11-05 01:22:17,667 - Modifying group hadoop1
> 2014-11-05 01:22:17,700 - Group['knox1'] {'ignore_failures': False}
> 2014-11-05 01:22:17,701 - Modifying group knox1
> 2014-11-05 01:22:17,726 - User['nobody'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'nobody']}
> 2014-11-05 01:22:17,726 - Modifying user nobody
> 2014-11-05 01:22:17,748 - User['hbase1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:17,749 - Modifying user hbase1
> 2014-11-05 01:22:17,788 - User['tez1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'users1']}
> 2014-11-05 01:22:17,788 - Modifying user tez1
> 2014-11-05 01:22:17,828 - User['yarn1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:17,828 - Modifying user yarn1
> 2014-11-05 01:22:17,868 - User['nobody1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'nobody1']}
> 2014-11-05 01:22:17,869 - Modifying user nobody1
> 2014-11-05 01:22:17,914 - User['oozie1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'users1']}
> 2014-11-05 01:22:17,914 - Modifying user oozie1
> 2014-11-05 01:22:17,940 - User['hive1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:17,941 - Modifying user hive1
> 2014-11-05 01:22:17,955 - User['sqoop1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:17,956 - Modifying user sqoop1
> 2014-11-05 01:22:17,971 - User['kafka1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:17,971 - Modifying user kafka1
> 2014-11-05 01:22:17,986 - User['zookeeper1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:17,986 - Modifying user zookeeper1
> 2014-11-05 01:22:18,001 - User['storm1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,001 - Modifying user storm1
> 2014-11-05 01:22:18,016 - User['nagios1'] {'gid': 'nagios1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,017 - Modifying user nagios1
> 2014-11-05 01:22:18,031 - User['mapred1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,032 - Modifying user mapred1
> 2014-11-05 01:22:18,046 - User['hcat1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,047 - Modifying user hcat1
> 2014-11-05 01:22:18,061 - User['hdfs1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,061 - Modifying user hdfs1
> 2014-11-05 01:22:18,076 - User['flume1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,076 - Modifying user flume1
> 2014-11-05 01:22:18,090 - User['knox1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,091 - Modifying user knox1
> 2014-11-05 01:22:18,105 - User['ambari-qa1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'users1']}
> 2014-11-05 01:22:18,105 - Modifying user ambari-qa1
> 2014-11-05 01:22:18,120 - User['falcon1'] {'gid': 'hadoop1', 'ignore_failures': False, 'groups': [u'hadoop1']}
> 2014-11-05 01:22:18,120 - Modifying user falcon1
> 2014-11-05 01:22:18,134 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2014-11-05 01:22:18,136 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa1 /tmp/hadoop-ambari-qa1,/tmp/hsperfdata_ambari-qa1,/home/ambari-qa1,/tmp/ambari-qa1,/tmp/sqoop-ambari-qa1 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa1) -gt 1000'}
> 2014-11-05 01:22:18,149 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa1 /tmp/hadoop-ambari-qa1,/tmp/hsperfdata_ambari-qa1,/home/ambari-qa1,/tmp/ambari-qa1,/tmp/sqoop-ambari-qa1 2>/dev/null'] due to not_if
> 2014-11-05 01:22:18,150 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2014-11-05 01:22:18,150 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase1 /home/hbase1,/tmp/hbase1,/usr/bin/hbase1,/var/log/hbase1,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase1) -gt 1000'}
> 2014-11-05 01:22:18,164 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase1 /home/hbase1,/tmp/hbase1,/usr/bin/hbase1,/var/log/hbase1,/hadoop/hbase 2>/dev/null'] due to not_if
> 2014-11-05 01:22:18,164 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
> 2014-11-05 01:22:18,165 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
> 2014-11-05 01:22:18,177 - Skipping Link['/etc/hadoop/conf'] due to not_if
> 2014-11-05 01:22:18,188 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs1'}
> 2014-11-05 01:22:18,200 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
> 2014-11-05 01:22:18,225 - Directory['/var/log/hadoop'] {'owner': 'root', 'group': 'hadoop1', 'mode': 0775, 'recursive': True}
> 2014-11-05 01:22:18,226 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
> 2014-11-05 01:22:18,227 - Directory['/tmp/hadoop-hdfs1'] {'owner': 'hdfs1', 'recursive': True}
> 2014-11-05 01:22:18,232 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs1'}
> 2014-11-05 01:22:18,233 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'hdfs1'}
> 2014-11-05 01:22:18,234 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs1', 'group': 'hadoop1', 'mode': 0644}
> 2014-11-05 01:22:18,239 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs1'}
> 2014-11-05 01:22:18,239 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
> 2014-11-05 01:22:18,370 - Directory['/etc/knox/conf'] {'owner': 'knox1', 'group': 'knox1', 'recursive': True}
> 2014-11-05 01:22:18,371 - Changing owner for /etc/knox/conf from 0 to knox1
> 2014-11-05 01:22:18,371 - Changing group for /etc/knox/conf from 0 to knox1
> 2014-11-05 01:22:18,371 - XmlConfig['gateway-site.xml'] {'owner': 'knox1', 'group': 'knox1', 'conf_dir': '/etc/knox/conf', 'configuration_attributes': ..., 'configurations': ...}
> 2014-11-05 01:22:18,384 - Generating config: /etc/knox/conf/gateway-site.xml
> 2014-11-05 01:22:18,385 - File['/etc/knox/conf/gateway-site.xml'] {'owner': 'knox1', 'content': InlineTemplate(...), 'group': 'knox1', 'mode': None, 'encoding': 'UTF-8'}
> 2014-11-05 01:22:18,385 - Writing File['/etc/knox/conf/gateway-site.xml'] because contents don't match
> 2014-11-05 01:22:18,386 - Changing owner for /etc/knox/conf/gateway-site.xml from 0 to knox1
> 2014-11-05 01:22:18,386 - Changing group for /etc/knox/conf/gateway-site.xml from 0 to knox1
> 2014-11-05 01:22:18,386 - File['/etc/knox/conf/gateway-log4j.properties'] {'content': '...', 'owner': 'knox1', 'group': 'knox1', 'mode': 0644}
> 2014-11-05 01:22:18,386 - Writing File['/etc/knox/conf/gateway-log4j.properties'] because contents don't match
> 2014-11-05 01:22:18,386 - Changing owner for /etc/knox/conf/gateway-log4j.properties from 0 to knox1
> 2014-11-05 01:22:18,387 - Changing group for /etc/knox/conf/gateway-log4j.properties from 0 to knox1
> 2014-11-05 01:22:18,393 - File['/etc/knox/conf/topologies/default.xml'] {'content': InlineTemplate(...), 'owner': 'knox1', 'group': 'knox1'}
> 2014-11-05 01:22:18,394 - Writing File['/etc/knox/conf/topologies/default.xml'] because it doesn't exist
> 2014-11-05 01:22:18,394 - Changing owner for /etc/knox/conf/topologies/default.xml from 0 to knox1
> 2014-11-05 01:22:18,394 - Changing group for /etc/knox/conf/topologies/default.xml from 0 to knox1
> 2014-11-05 01:22:18,395 - Execute['/usr/hdp/current/knox-server/bin/knoxcli.sh create-master --master [PROTECTED]'] {'environment': ..., 'not_if': 'test -f /var/lib/knox/data/security/master', 'user': 'knox1'}
> 2014-11-05 01:22:19,777 - Error while executing command 'start':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/KNOX/package/scripts/knox_gateway.py", line 43, in start
>     self.configure(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/KNOX/package/scripts/knox_gateway.py", line 37, in configure
>     knox()
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.2/services/KNOX/package/scripts/knox.py", line 63, in knox
>     not_if=format('test -f {knox_master_secret_path}')
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
>     raise ex
> Fail: Execution of '/usr/hdp/current/knox-server/bin/knoxcli.sh create-master --master [PROTECTED]' returned 1. log4j:ERROR setFile(null,true) call failed.
> java.io.FileNotFoundException: /usr/hdp/2.2.0.0-1770/knox/bin/../logs/knoxcli.log (Permission denied)
> 	at java.io.FileOutputStream.open(Native Method)
> 	at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
> 	at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
> 	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
> 	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
> 	at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
> 	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
> 	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
> 	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
> 	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
> 	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
> 	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
> 	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
> 	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:395)
> 	at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:403)
> 	at org.apache.hadoop.gateway.util.KnoxCLI.main(KnoxCLI.java:648)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:70)
> 	at org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:39)
> 	at org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
> 	at org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:69)
> 	at org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:46)
> log4j:ERROR Either File or DatePattern options are not set for appender [drfa].
> This command requires write permissions on the security directory: /usr/hdp/2.2.0.0-1770/knox/bin/../data/security
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
