ambari-dev mailing list archives

From "Hudson (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-8932) Creating hdfs directories on deploy takes too long, Part 2, reduces deploy time by ~6min
Date Mon, 29 Dec 2014 14:20:13 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-8932?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14260121#comment-14260121
] 

Hudson commented on AMBARI-8932:
--------------------------------

FAILURE: Integrated in Ambari-trunk-Commit #1351 (See [https://builds.apache.org/job/Ambari-trunk-Commit/1351/])
AMBARI-8932. Creating hdfs directories on deploy takes too long, Part 2, reduces deploy time
by ~6min (aonishuk) (aonishuk: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=df9e096f1731d1b0fb3c53e5f306101191f8195c)
* ambari-common/src/main/python/resource_management/libraries/providers/__init__.py
* ambari-common/src/main/python/resource_management/core/source.py
* ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/service_check.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py
* ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py
* ambari-server/src/main/resources/common-services/OOZIE/4.0.0.2.0/package/files/oozieSmoke2.sh
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_zkfc.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params.py
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_namenode.py
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_journalnode.py
* ambari-server/src/test/python/stacks/2.0.6/OOZIE/test_oozie_server.py
* ambari-common/src/main/python/resource_management/libraries/functions/get_namenode_states.py
* ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_snamenode.py
* ambari-server/src/main/resources/stacks/HDP/2.0.6/services/YARN/package/scripts/yarn.py
* ambari-server/src/test/python/stacks/2.0.6/PIG/test_pig_service_check.py
* ambari-common/src/main/python/resource_management/libraries/functions/version.py
* ambari-common/src/main/python/resource_management/libraries/providers/copy_from_local.py
* ambari-agent/src/test/python/resource_management/TestXmlConfigResource.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/files/templetonSmoke.sh
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py
* ambari-server/src/test/python/stacks/2.0.6/YARN/test_nodemanager.py
* ambari-server/src/main/resources/common-services/FALCON/0.5.0.2.1/package/scripts/params.py
* ambari-server/src/test/python/stacks/2.0.6/HBASE/test_hbase_regionserver.py
* ambari-common/src/main/python/resource_management/libraries/resources/__init__.py
* ambari-agent/src/test/python/resource_management/TestCopyFromLocal.py
* ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/files/fast-hdfs-resource.jar
* contrib/fast-hdfs-resource/src/main/java/org/apache/ambari/fast_hdfs_resource/Resource.java
* ambari-common/src/main/python/resource_management/libraries/providers/hdfs_directory.py
* ambari-agent/src/test/python/resource_management/TestPropertiesFileResource.py
* ambari-server/src/main/resources/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py
* ambari-common/src/main/python/resource_management/libraries/resources/hdfs_directory.py
* ambari-server/src/main/resources/common-services/FALCON/0.5.0.2.1/package/scripts/falcon.py
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_datanode.py
* ambari-server/src/test/python/stacks/2.1/FALCON/test_falcon_server.py
* ambari-server/src/test/python/stacks/2.0.6/YARN/test_yarn_client.py
* ambari-common/src/main/python/resource_management/libraries/resources/hdfs_resource.py
* ambari-agent/src/test/python/resource_management/TestRepositoryResource.py
* ambari-server/src/main/resources/stacks/HDP/2.0.6/services/YARN/package/scripts/historyserver.py
* ambari-server/src/main/resources/common-services/HBASE/0.96.0.2.0/package/scripts/hbase.py
* ambari-common/src/main/python/resource_management/libraries/resources/copy_from_local.py
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_snamenode.py
* ambari-server/src/main/resources/common-services/HBASE/0.96.0.2.0/package/scripts/params.py
* ambari-agent/src/test/python/resource_management/TestContentSources.py
* ambari-server/src/main/resources/common-services/PIG/0.12.0.2.0/package/scripts/params.py
* ambari-server/src/main/resources/common-services/PIG/0.12.0.2.0/package/scripts/service_check.py
* ambari-common/src/main/python/resource_management/libraries/functions/__init__.py
* ambari-server/src/test/python/unitTests.py
* ambari-server/src/test/python/stacks/2.0.6/configs/default.json
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_hive_server.py
* contrib/fast-hdfs-resource/dependency-reduced-pom.xml
* ambari-server/src/main/resources/stacks/HDP/2.0.6/services/YARN/package/scripts/params.py
* ambari-common/src/main/python/resource_management/libraries/providers/hdfs_resource.py
* ambari-server/src/main/resources/common-services/TEZ/0.4.0.2.1/package/scripts/params.py
* ambari-server/src/main/resources/common-services/OOZIE/4.0.0.2.0/package/scripts/params.py
* ambari-agent/pom.xml
* ambari-server/src/test/python/stacks/2.0.6/HBASE/test_hbase_master.py
* ambari-server/src/test/python/stacks/2.0.6/YARN/test_historyserver.py
* ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_namenode.py
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py
* ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/params.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/install_jars.py
* ambari-server/src/test/python/stacks/2.0.6/YARN/test_mapreduce2_client.py
* ambari-server/src/test/python/stacks/2.0.6/HDFS/test_service_check.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
* ambari-server/src/main/resources/common-services/HBASE/0.96.0.2.0/package/scripts/service_check.py
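
The change set above replaces the per-path pattern visible in the log below (HdfsDirectory / CopyFromLocal shelling out to `hadoop fs` for every directory and file) with a batched HdfsResource; both the new hdfs_resource.py provider and the bundled fast-hdfs-resource.jar appear in the file list. The following is only a minimal sketch of that batching idea, with hypothetical helper names (`hdfs_directory`, `apply_batch`) and an assumed jar invocation, not the actual resource_management API:

    # Illustrative sketch only -- helper names and the jar invocation are
    # assumptions for illustration, not the API introduced by this patch.
    import json
    import subprocess
    import tempfile

    _pending = []  # queued HDFS resource requests

    def hdfs_directory(path, owner, mode):
        """Queue a directory-creation request instead of shelling out to `hadoop fs` now."""
        _pending.append({"target": path, "type": "directory", "action": "create",
                         "owner": owner, "mode": mode})

    def apply_batch(jar="fast-hdfs-resource.jar"):
        """Write all queued requests to one JSON file and apply them in a single call."""
        if not _pending:
            return
        with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as spec:
            json.dump(_pending, spec)
        # One client/JVM start-up for the whole batch instead of one per path.
        subprocess.check_call(["hadoop", "jar", jar, spec.name])
        del _pending[:]

    # Usage: queue several directories, then pay the client start-up cost once.
    hdfs_directory("/hdp/apps/2.2.0.0-2041/hive", owner="hdfs", mode="0555")
    hdfs_directory("/hdp/apps/2.2.0.0-2041/pig",  owner="hdfs", mode="0555")
    apply_batch()

The saving comes from paying the hadoop client start-up cost once per batch rather than once per operation.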


> Creating hdfs directories on deploy takes too long, Part 2, reduces deploy time by ~6min
> ----------------------------------------------------------------------------------------
>
>                 Key: AMBARI-8932
>                 URL: https://issues.apache.org/jira/browse/AMBARI-8932
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 2.0.0
>
>
> Take a look at the webhcat logs: creating HDFS directories by invoking the hadoop binary
one call at a time takes too long.
>     
>     2014-12-10 17:09:29,060 - ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz']
{'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir':
'/etc/hadoop/conf'}
>     2014-12-10 17:09:29,073 - Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz']
{'logoutput': True, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hcat', 'path':
['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:09:46,301 - ls: `hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz': No
such file or directory
>     2014-12-10 17:09:46,301 - HdfsDirectory['hdfs:///hdp/apps/2.2.0.0-2041/hive'] {'security_enabled':
False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local':
'', 'mode': 0555, 'owner': 'hdfs', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action':
['create']}
>     2014-12-10 17:09:46,303 - Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm
-q hadoop | grep -q "hadoop-1" || echo "-p"` hdfs:///hdp/apps/2.2.0.0-2041/hive &&
hadoop --config /etc/hadoop/conf fs -chmod  555 hdfs:///hdp/apps/2.2.0.0-2041/hive &&
hadoop --config /etc/hadoop/conf fs -chown  hdfs hdfs:///hdp/apps/2.2.0.0-2041/hive'] {'not_if':
"/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export {ENV_PLACEHOLDER} > /dev/null ; hadoop
--config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.0.0-2041/hive'", 'user': 'hdfs', 'path':
['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:10:29,989 - CopyFromLocal['/usr/hdp/current/hive-client/hive.tar.gz']
{'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'group': 'hadoop', 'hdfs_user': 'hdfs',
'owner': 'hdfs', 'kinnit_if_needed': '', 'dest_dir': 'hdfs:///hdp/apps/2.2.0.0-2041/hive',
'hadoop_conf_dir': '/etc/hadoop/conf', 'mode': 0444}
>     2014-12-10 17:10:30,017 - ExecuteHadoop['fs -copyFromLocal /usr/hdp/current/hive-client/hive.tar.gz
hdfs:///hdp/apps/2.2.0.0-2041/hive'] {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c
'export {ENV_PLACEHOLDER} > /dev/null ; PATH=$PATH:/usr/hdp/current/hadoop-client/bin hadoop
fs -ls hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'", 'bin_dir': '/usr/hdp/current/hadoop-client/bin',
'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:10:48,275 - Execute['hadoop --config /etc/hadoop/conf fs -copyFromLocal
/usr/hdp/current/hive-client/hive.tar.gz hdfs:///hdp/apps/2.2.0.0-2041/hive'] {'logoutput':
False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:11:07,134 - ExecuteHadoop['fs -chown hdfs:hadoop hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz']
{'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:11:07,135 - Execute['hadoop --config /etc/hadoop/conf fs -chown hdfs:hadoop
hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'logoutput': False, 'try_sleep': 0, 'environment':
..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:11:16,533 - ExecuteHadoop['fs -chmod 444 hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz']
{'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:11:16,534 - Execute['hadoop --config /etc/hadoop/conf fs -chmod 444
hdfs:///hdp/apps/2.2.0.0-2041/hive/hive.tar.gz'] {'logoutput': False, 'try_sleep': 0, 'environment':
..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:11:29,515 - ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz']
{'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir':
'/etc/hadoop/conf'}
>     2014-12-10 17:11:29,516 - Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz']
{'logoutput': True, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hcat', 'path':
['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:11:45,791 - ls: `hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz': No
such file or directory
>     2014-12-10 17:11:45,791 - HdfsDirectory['hdfs:///hdp/apps/2.2.0.0-2041/pig'] {'security_enabled':
False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local':
'', 'mode': 0555, 'owner': 'hdfs', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action':
['create']}
>     2014-12-10 17:11:45,794 - Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm
-q hadoop | grep -q "hadoop-1" || echo "-p"` hdfs:///hdp/apps/2.2.0.0-2041/pig &&
hadoop --config /etc/hadoop/conf fs -chmod  555 hdfs:///hdp/apps/2.2.0.0-2041/pig &&
hadoop --config /etc/hadoop/conf fs -chown  hdfs hdfs:///hdp/apps/2.2.0.0-2041/pig'] {'not_if':
"/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export {ENV_PLACEHOLDER} > /dev/null ; hadoop
--config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.0.0-2041/pig'", 'user': 'hdfs', 'path':
['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:12:31,703 - CopyFromLocal['/usr/hdp/current/pig-client/pig.tar.gz']
{'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'group': 'hadoop', 'hdfs_user': 'hdfs',
'owner': 'hdfs', 'kinnit_if_needed': '', 'dest_dir': 'hdfs:///hdp/apps/2.2.0.0-2041/pig',
'hadoop_conf_dir': '/etc/hadoop/conf', 'mode': 0444}
>     2014-12-10 17:12:31,703 - ExecuteHadoop['fs -copyFromLocal /usr/hdp/current/pig-client/pig.tar.gz
hdfs:///hdp/apps/2.2.0.0-2041/pig'] {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export
{ENV_PLACEHOLDER} > /dev/null ; PATH=$PATH:/usr/hdp/current/hadoop-client/bin hadoop fs
-ls hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'", 'bin_dir': '/usr/hdp/current/hadoop-client/bin',
'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:12:49,508 - Execute['hadoop --config /etc/hadoop/conf fs -copyFromLocal
/usr/hdp/current/pig-client/pig.tar.gz hdfs:///hdp/apps/2.2.0.0-2041/pig'] {'logoutput': False,
'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:13:09,506 - ExecuteHadoop['fs -chown hdfs:hadoop hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz']
{'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:13:09,507 - Execute['hadoop --config /etc/hadoop/conf fs -chown hdfs:hadoop
hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'] {'logoutput': False, 'try_sleep': 0, 'environment':
..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:13:18,968 - ExecuteHadoop['fs -chmod 444 hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz']
{'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:13:18,969 - Execute['hadoop --config /etc/hadoop/conf fs -chmod 444
hdfs:///hdp/apps/2.2.0.0-2041/pig/pig.tar.gz'] {'logoutput': False, 'try_sleep': 0, 'environment':
..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:13:32,936 - ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar']
{'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir':
'/etc/hadoop/conf'}
>     2014-12-10 17:13:32,937 - Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar']
{'logoutput': True, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hcat', 'path':
['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:13:52,891 - ls: `hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar':
No such file or directory
>     2014-12-10 17:13:52,892 - HdfsDirectory['hdfs:///hdp/apps/2.2.0.0-2041/mapreduce']
{'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user':
'hdfs', 'kinit_path_local': '', 'mode': 0555, 'owner': 'hdfs', 'bin_dir': '/usr/hdp/current/hadoop-client/bin',
'action': ['create']}
>     2014-12-10 17:13:52,904 - Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm
-q hadoop | grep -q "hadoop-1" || echo "-p"` hdfs:///hdp/apps/2.2.0.0-2041/mapreduce &&
hadoop --config /etc/hadoop/conf fs -chmod  555 hdfs:///hdp/apps/2.2.0.0-2041/mapreduce &&
hadoop --config /etc/hadoop/conf fs -chown  hdfs hdfs:///hdp/apps/2.2.0.0-2041/mapreduce']
{'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'export {ENV_PLACEHOLDER} > /dev/null
; hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.0.0-2041/mapreduce'", 'user':
'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:14:03,832 - Skipping Execute['hadoop --config /etc/hadoop/conf fs -mkdir
`rpm -q hadoop | grep -q "hadoop-1" || echo "-p"` hdfs:///hdp/apps/2.2.0.0-2041/mapreduce
&& hadoop --config /etc/hadoop/conf fs -chmod  555 hdfs:///hdp/apps/2.2.0.0-2041/mapreduce
&& hadoop --config /etc/hadoop/conf fs -chown  hdfs hdfs:///hdp/apps/2.2.0.0-2041/mapreduce']
due to not_if
>     2014-12-10 17:14:03,833 - CopyFromLocal['/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar']
{'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'group': 'hadoop', 'hdfs_user': 'hdfs',
'owner': 'hdfs', 'kinnit_if_needed': '', 'dest_dir': 'hdfs:///hdp/apps/2.2.0.0-2041/mapreduce',
'hadoop_conf_dir': '/etc/hadoop/conf', 'mode': 0444}
>     2014-12-10 17:14:03,836 - ExecuteHadoop['fs -copyFromLocal /usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar
hdfs:///hdp/apps/2.2.0.0-2041/mapreduce'] {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash
-c 'export {ENV_PLACEHOLDER} > /dev/null ; PATH=$PATH:/usr/hdp/current/hadoop-client/bin
hadoop fs -ls hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'", 'bin_dir': '/usr/hdp/current/hadoop-client/bin',
'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:14:12,682 - Execute['hadoop --config /etc/hadoop/conf fs -copyFromLocal
/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar hdfs:///hdp/apps/2.2.0.0-2041/mapreduce']
{'logoutput': False, 'try_sleep': 0, 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path':
['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:14:22,350 - ExecuteHadoop['fs -chown hdfs:hadoop hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar']
{'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:14:22,352 - Execute['hadoop --config /etc/hadoop/conf fs -chown hdfs:hadoop
hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'] {'logoutput': False, 'try_sleep':
0, 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:14:34,163 - ExecuteHadoop['fs -chmod 444 hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar']
{'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
>     2014-12-10 17:14:34,164 - Execute['hadoop --config /etc/hadoop/conf fs -chmod 444
hdfs:///hdp/apps/2.2.0.0-2041/mapreduce/hadoop-streaming.jar'] {'logoutput': False, 'try_sleep':
0, 'environment': ..., 'tries': 1, 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>     2014-12-10 17:14:50,851 - Could not find file: /usr/hdp/current/sqoop-client/sqoop.tar.gz
>     2014-12-10 17:14:50,862 - XmlConfig['webhcat-site.xml'] {'owner': 'hcat', 'group':
'hadoop', 'conf_dir': '/etc/hive-webhcat/conf', 'configuration_attributes': ..., 'configurations':
...}
>     2014-12-10 17:14:50,979 - Generating config: /etc/hive-webhcat/conf/webhcat-site.xml
>     2014-12-10 17:14:50,980 - File['/etc/hive-webhcat/conf/webhcat-site.xml'] {'owner':
'hcat', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
>     2014-12-10 17:14:50,983 - Writing File['/etc/hive-webhcat/conf/webhcat-site.xml']
because it doesn't exist
>     2014-12-10 17:14:51,114 - Changing owner for /etc/hive-webhcat/conf/webhcat-site.xml
from 0 to hcat
>     2014-12-10 17:14:51,169 - Changing group for /etc/hive-webhcat/conf/webhcat-site.xml
from 0 to hadoop
>     2014-12-10 17:14:51,221 - File['/etc/hive-webhcat/conf/webhcat-env.sh'] {'content':
InlineTemplate(...), 'owner': 'hcat', 'group': 'hadoop'}
>     2014-12-10 17:14:51,222 - Writing File['/etc/hive-webhcat/conf/webhcat-env.sh'] because
it doesn't exist
>     2014-12-10 17:14:51,312 - Changing owner for /etc/hive-webhcat/conf/webhcat-env.sh
from 0 to hcat
>     2014-12-10 17:14:51,367 - Changing group for /etc/hive-webhcat/conf/webhcat-env.sh
from 0 to hadoop
>     2014-12-10 17:14:51,423 - Execute['env HADOOP_HOME=/usr/hdp/current/hadoop-client
/usr/hdp/current/hive-webhcat/sbin/webhcat_server.sh start'] {'not_if': 'ls /var/run/webhcat/webhcat.pid
>/dev/null 2>&1 && ps -p `cat /var/run/webhcat/webhcat.pid` >/dev/null
2>&1', 'user': 'hcat'}
>     
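
For context, reading the timestamps in the excerpt above: the WebHCat start sequence runs from 17:09:29 to 17:14:51, about 5 minutes 22 seconds, most of it spent in roughly twenty separate `hadoop fs` invocations at 15-20 seconds each (dominated by client start-up rather than the HDFS operations themselves), which is consistent with the ~6 min reduction claimed in the summary. A quick check of that arithmetic:

    # Rough per-call cost from the timestamps above (call count is approximate).
    from datetime import datetime

    fmt = "%Y-%m-%d %H:%M:%S,%f"
    start = datetime.strptime("2014-12-10 17:09:29,060", fmt)
    end   = datetime.strptime("2014-12-10 17:14:51,423", fmt)

    elapsed = (end - start).total_seconds()
    calls = 20  # approximate number of separate `hadoop fs` invocations in the excerpt
    print("total: %.0f s, per call: %.1f s" % (elapsed, elapsed / calls))
    # -> total: 322 s, per call: 16.1 s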



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
