Date: Mon, 11 Dec 2017 11:13:00 +0000 (UTC)
From: "Hadoop QA (JIRA)"
To: issues@ambari.apache.org
Subject: [jira] [Commented] (AMBARI-22622) NFSGateway start failing with error : "ERROR: You must be a privileged user in order to run a secure service."

    [ https://issues.apache.org/jira/browse/AMBARI-22622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16285776#comment-16285776 ]

Hadoop QA commented on AMBARI-22622:
------------------------------------

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  http://issues.apache.org/jira/secure/attachment/12901476/AMBARI-22622.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:red}-1 tests included{color}.
        The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:green}+1 core tests{color}.  The patch passed unit tests in ambari-server.

Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/12820//console

This message is automatically generated.


> NFSGateway start failing with error : "ERROR: You must be a privileged user in order to run a secure service."
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-22622
>                 URL: https://issues.apache.org/jira/browse/AMBARI-22622
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 3.0.0
>
>         Attachments: AMBARI-22622.patch
>
>
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py", line 43, in 
>     BeforeStartHook().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 368, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py", line 34, in hook
>     setup_hadoop()
>   File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/shared_initialization.py", line 45, in setup_hadoop
>     cd_access='a',
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 185, in action_create
>     sudo.makedirs(path, self.resource.mode or 0755)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 102, in makedirs
>     os.makedirs(path, mode)
>   File "/usr/lib64/python2.7/os.py", line 157, in makedirs
>     mkdir(name, mode)
> OSError: [Errno 17] File exists: '/grid/0/log/hdfs'
>
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py", line 88, in 
>     NFSGateway().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 368, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py", line 53, in start
>     nfsgateway(action="start")
>   File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/hdfs_nfsgateway.py", line 74, in nfsgateway
>     create_log_dir=True
>   File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py", line 273, in service
>     Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
>     tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
>     raise ExecutionFailed(err_msg, code, out, err)
> resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/3.0.0.0-555/hadoop/bin/hdfs --config /usr/hdp/3.0.0.0-555/hadoop/conf --daemon start nfs3'' returned 1. WARNING: HADOOP_PRIVILEGED_NFS_USER has been replaced by HDFS_NFS3_SECURE_USER. Using value of HADOOP_PRIVILEGED_NFS_USER.
> WARNING: HADOOP_NFS3_OPTS has been replaced by HDFS_NFS3_OPTS. Using value of HADOOP_NFS3_OPTS.
> ERROR: You must be a privileged user in order to run a secure service.
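
The "ERROR: You must be a privileged user in order to run a secure service." message comes from the Hadoop 3 launcher: when a secure NFS gateway is configured (HDFS_NFS3_SECURE_USER is set), the daemon must be started as root so it can bind privileged ports before dropping to the secure user, yet the command above starts it with `su hdfs`. The sketch below only illustrates the kind of run-as-user decision involved; the function name `nfsgateway_run_user` and its parameters are hypothetical and it is not the code in utils.py or the attached patch.

{code:python}
def nfsgateway_run_user(hadoop_env, default_user="hdfs"):
    """Illustrative only: pick the user that should invoke 'hdfs --daemon start nfs3'."""
    if hadoop_env.get("HDFS_NFS3_SECURE_USER"):
        # Assumption for illustration: a privileged (root) launch is required for the
        # secure gateway; the Hadoop launcher later switches to the configured secure user.
        return "root"
    return default_user
{code}

With a check like this the generated start command would run as a privileged user only when the secure gateway settings are present, which is what the error above demands.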
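The first traceback in the description is a separate, secondary failure: `sudo.makedirs` in the before-START hook calls `os.makedirs` directly, which raises `OSError: [Errno 17] File exists` when `/grid/0/log/hdfs` is already present. Below is a minimal sketch of an existence-tolerant variant, assuming the goal is only to make the hook rerunnable; it is not taken from the attached patch.

{code:python}
import errno
import os

def makedirs_tolerant(path, mode=0o755):
    """Create path and any missing parents, but do not fail if the directory already exists."""
    try:
        os.makedirs(path, mode)
    except OSError as err:
        # A pre-existing directory (EEXIST) is fine on reruns; anything else,
        # including a regular file occupying the path, is re-raised.
        if err.errno != errno.EEXIST or not os.path.isdir(path):
            raise
{code}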
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)