ambari-issues mailing list archives

From "Andrew Onischuk (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-22622) NFSGateway start failing with error : "ERROR: You must be a privileged user in order to run a secure service."
Date Mon, 11 Dec 2017 10:39:03 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-22622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Onischuk updated AMBARI-22622:
-------------------------------------
    Status: Patch Available  (was: Open)

> NFSGateway start failing with error : "ERROR: You must be a privileged user in order to run a secure service."
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-22622
>                 URL: https://issues.apache.org/jira/browse/AMBARI-22622
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 3.0.0
>
>         Attachments: AMBARI-22622.patch
>
>
>     
>     Traceback (most recent call last):
>       File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py", line 43, in <module>
>         BeforeStartHook().execute()
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 368, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py", line 34, in hook
>         setup_hadoop()
>       File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/shared_initialization.py", line 45, in setup_hadoop
>         cd_access='a',
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 185, in action_create
>         sudo.makedirs(path, self.resource.mode or 0755)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 102, in makedirs
>         os.makedirs(path, mode)
>       File "/usr/lib64/python2.7/os.py", line 157, in makedirs
>         mkdir(name, mode)
>     OSError: [Errno 17] File exists: '/grid/0/log/hdfs'
>     Traceback (most recent call last):
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py", line 88, in <module>
>         NFSGateway().execute()
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 368, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py", line 53, in start
>         nfsgateway(action="start")
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/hdfs_nfsgateway.py", line 74, in nfsgateway
>         create_log_dir=True
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py", line 273, in service
>         Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
>         tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
>         result = function(command, **kwargs)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
>         tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
>         result = _call(command, **kwargs_copy)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
>         raise ExecutionFailed(err_msg, code, out, err)
>     resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/3.0.0.0-555/hadoop/bin/hdfs --config /usr/hdp/3.0.0.0-555/hadoop/conf --daemon start nfs3'' returned 1. WARNING: HADOOP_PRIVILEGED_NFS_USER has been replaced by HDFS_NFS3_SECURE_USER. Using value of HADOOP_PRIVILEGED_NFS_USER.
>     WARNING: HADOOP_NFS3_OPTS has been replaced by HDFS_NFS3_OPTS. Using value of HADOOP_NFS3_OPTS.
>     ERROR: You must be a privileged user in order to run a secure service.
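Editor's note (not part of the original report): the log itself explains the failure. A secure Hadoop NFS3 gateway binds privileged ports, so it must be launched as root and only then drops to the unprivileged user named in HDFS_NFS3_SECURE_USER; the `su hdfs` invocation above starts it directly as `hdfs`, which triggers the "privileged user" error. A minimal sketch of a root-run start, reusing the paths and build id (3.0.0.0-555) from the log (the actual fix is in the attached AMBARI-22622.patch, which is not reproduced here):

```shell
# Sketch only, run as root: the hdfs launcher starts the secure NFS3
# gateway with root privileges and the daemon drops to the user named
# in HDFS_NFS3_SECURE_USER (the hdfs service user in this cluster).
export HDFS_NFS3_SECURE_USER=hdfs
/usr/hdp/3.0.0.0-555/hadoop/bin/hdfs \
  --config /usr/hdp/3.0.0.0-555/hadoop/conf \
  --daemon start nfs3
```

This is a cluster-specific command fragment rather than portable code; the point is only that the outer `su hdfs` wrapper must go, not the daemon command itself.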



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
