ambari-issues mailing list archives

From "Matt (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-18837) HAWQ Master fails to start when webhdfs is disabled
Date Wed, 09 Nov 2016 21:02:58 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-18837?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15652031#comment-15652031 ]

Matt commented on AMBARI-18837:
-------------------------------

Committed to trunk:
{code}
commit f33dd9e3821facb4094966d15c6c8549d5ec95a3
Author: Matt <mmathew@pivotal.io>
Date:   Wed Nov 9 13:01:11 2016 -0800
{code}
Committed to branch-2.5:
{code}
commit c503f7c9cabd00f760366536e1d1c1c513186eca
Author: Matt <mmathew@pivotal.io>
Date:   Wed Nov 9 13:01:38 2016 -0800
{code}
Committed to branch-2.4:
{code}
commit fb1064bd2c7ad7de22a3e7a13a78490be2959ff3
Author: Matt <mmathew@pivotal.io>
Date:   Wed Nov 9 13:02:03 2016 -0800
{code}
Marking as resolved.

> HAWQ Master fails to start when webhdfs is disabled
> ---------------------------------------------------
>
>                 Key: AMBARI-18837
>                 URL: https://issues.apache.org/jira/browse/AMBARI-18837
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Matt
>            Assignee: Matt
>             Fix For: trunk, 2.5.0, 2.4.2
>
>         Attachments: AMBARI-18837-orig.patch
>
>
> The HdfsResource is missing the hadoop_conf_dir and hadoop_bin_dir parameters, which are required when webhdfs is not enabled.
> {code}
> stderr:   /var/lib/ambari-agent/data/errors-22205.txt
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/package/scripts/hawqmaster.py", line 98, in <module>
>     HawqMaster().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/package/scripts/hawqmaster.py", line 57, in start
>     common.start_component(hawq_constants.MASTER, params.hawq_master_address_port, params.hawq_master_dir)
>   File "/var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/package/scripts/common.py", line 292, in start_component
>     params.HdfsResource(None, action="execute")
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 469, in action_execute
>     self.get_hdfs_resource_executor().action_execute(self)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 124, in action_execute
>     logoutput=logoutput,
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 180, in _call
>     path = os.pathsep.join(path) if isinstance(path, (list, tuple)) else path
> TypeError: sequence item 0: expected string, NoneType found
> stdout:   /var/lib/ambari-agent/data/output-22205.txt
> 2016-11-02 19:27:04,973 - HdfsResource['/hawq_default'] {'security_enabled': False, 'keytab': [EMPTY], 'default_fs': 'hdfs://CentralPerk2NNSrvc', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'recursive_chown': True, 'owner': 'gpadmin', 'group': 'gpadmin', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/data/hive/databases', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
> 2016-11-02 19:27:04,978 - HdfsResource[None] {'security_enabled': False, 'keytab': [EMPTY], 'default_fs': 'hdfs://CentralPerk2NNSrvc', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['execute'], 'immutable_paths': [u'/data/hive/databases', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon']}
> 2016-11-02 19:27:04,980 - File['/var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json'] {'content': '[{"group": "gpadmin", "target": "/hawq_default", "action": "create", "manageIfExists": true, "mode": "755", "owner": "gpadmin", "type": "directory", "recursiveChown": true}]', 'owner': 'hdfs'}
> 2016-11-02 19:27:04,982 - Writing File['/var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json'] because it doesn't exist
> 2016-11-02 19:27:04,983 - Changing owner for /var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json from 0 to hdfs
> 2016-11-02 19:27:04,984 - Execute['hadoop --config None jar /var/lib/ambari-agent/lib/fast-hdfs-resource.jar /var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json'] {'logoutput': None, 'path': [None], 'user': 'hdfs'}
> {code}
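For context on the failure mode in the log above: the stdout shows the final command built as `hadoop --config None ...` with `'path': [None]`, because the unset hadoop_conf_dir and hadoop_bin_dir fall through as None. The crash then happens when shell.py joins the path list. A minimal standalone sketch (not Ambari code; the function name and sample values are hypothetical) that mirrors the failing line from the traceback:

```python
import os

def build_path_env(path):
    # Mirrors the failing line in resource_management/core/shell.py shown in
    # the traceback: a list-valued path is joined with os.pathsep.
    return os.pathsep.join(path) if isinstance(path, (list, tuple)) else path

# With webhdfs disabled and hadoop_bin_dir not supplied, the path list ends
# up as [None], and str.join rejects the None element with the TypeError
# reported in the stderr above.
hadoop_bin_dir = None
try:
    build_path_env([hadoop_bin_dir])
    print("no error")
except TypeError as e:
    print("reproduced TypeError:", e)

# Supplying a real bin dir (hypothetical value) makes the join succeed,
# which is the effect of passing hadoop_bin_dir through to HdfsResource.
print(build_path_env(["/usr/hdp/current/hadoop-client/bin"]))
```

The same reasoning explains `--config None` in the Execute line: hadoop_conf_dir was interpolated as None into the command string instead of raising earlier, so the first hard failure is the path join.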



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
