ambari-dev mailing list archives

From "Hudson (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-8944) HIVE_METASTORE start failed on Ubuntu12
Date Mon, 29 Dec 2014 20:17:14 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-8944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14260405#comment-14260405
] 

Hudson commented on AMBARI-8944:
--------------------------------

FAILURE: Integrated in Ambari-trunk-Commit-docker #621 (See [https://builds.apache.org/job/Ambari-trunk-Commit-docker/621/])
AMBARI-8944. HIVE_METASTORE start failed on Ubuntu12.(vbrodetskyi) (vbrodetskyi: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=6271c9c41f1a2b49bb6472d3eb86f23c14d714ee)
* ambari-server/src/main/resources/common-services/OOZIE/4.0.0.2.0/package/scripts/params.py
* ambari-server/src/main/resources/stacks/HDP/1.3.2/services/OOZIE/package/scripts/params.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py
* ambari-server/src/main/resources/stacks/HDP/1.3.2/services/OOZIE/package/scripts/oozie.py
* ambari-server/src/main/resources/stacks/HDP/1.3.2/services/HIVE/package/scripts/params.py
* ambari-server/src/main/resources/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py
* ambari-server/src/test/python/stacks/2.1/HIVE/test_hive_metastore.py
* ambari-server/src/main/resources/stacks/HDP/1.3.2/services/HIVE/package/scripts/hive.py
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_hive_metastore.py
* ambari-server/src/main/resources/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_service.py
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_hive_server.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params.py


> HIVE_METASTORE start failed on Ubuntu12
> ---------------------------------------
>
>                 Key: AMBARI-8944
>                 URL: https://issues.apache.org/jira/browse/AMBARI-8944
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.0.0
>            Reporter: Vitaly Brodetskyi
>            Assignee: Vitaly Brodetskyi
>            Priority: Blocker
>             Fix For: 2.0.0
>
>         Attachments: AMBARI-8944.patch
>
>
> Error output:
> {noformat}
> 2014-12-25 01:38:57,406 - Error while executing command 'start':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 152, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 47, in start
>     self.configure(env)  # FOR SECURITY
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py", line 40, in configure
>     hive(name = 'metastore')
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 76, in hive
>     jdbc_connector()
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 223, in jdbc_connector
>     sudo=True
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 266, in action_run
>     raise ex
> Fail: Execution of 'cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar' returned 1. cp: not writing through dangling symlink `/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar'
>     "stdout" : "2014-12-25 01:38:55,939 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;
    curl -kf -x \"\" --retry 10     http://ambsmoke12-1419466638-9.cs1cloud.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip
-o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment':
..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip',
'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}\n2014-12-25 01:38:55,944 - Skipping
Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x \"\" --retry
10     http://ambsmoke12-1419466638-9.cs1cloud.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip
-o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if\n2014-12-25
01:38:55,945 - Group['hadoop'] {'ignore_failures': False}\n2014-12-25 01:38:55,946 - Modifying
group hadoop\n2014-12-25 01:38:56,052 - Group['nobody'] {'ignore_failures': False}\n2014-12-25
01:38:56,053 - Modifying group nobody\n2014-12-25 01:38:56,143 - Group['users'] {'ignore_failures':
False}\n2014-12-25 01:38:56,144 - Modifying group users\n2014-12-25 01:38:56,227 - Group['knox']
{'ignore_failures': False}\n2014-12-25 01:38:56,227 - Modifying group knox\n2014-12-25 01:38:56,310
- User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}\n2014-12-25
01:38:56,311 - Modifying user nobody\n2014-12-25 01:38:56,327 - User['hive'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25 01:38:56,328 - Modifying user
hive\n2014-12-25 01:38:56,343 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False,
'groups': [u'users']}\n2014-12-25 01:38:56,343 - Modifying user oozie\n2014-12-25 01:38:56,360
- User['root'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25
01:38:56,361 - Modifying user root\n2014-12-25 01:38:56,377 - User['ambari-qa'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'users']}\n2014-12-25 01:38:56,377 - Modifying user
ambari-qa\n2014-12-25 01:38:56,393 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False,
'groups': [u'hadoop']}\n2014-12-25 01:38:56,393 - Modifying user hdfs\n2014-12-25 01:38:56,407
- User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25
01:38:56,407 - Modifying user knox\n2014-12-25 01:38:56,419 - User['storm'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25 01:38:56,419 - Modifying user
storm\n2014-12-25 01:38:56,430 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False,
'groups': [u'hadoop']}\n2014-12-25 01:38:56,430 - Modifying user mapred\n2014-12-25 01:38:56,440
- User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25
01:38:56,441 - Modifying user hbase\n2014-12-25 01:38:56,455 - User['tez'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'users']}\n2014-12-25 01:38:56,455 - Modifying user
tez\n2014-12-25 01:38:56,468 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False,
'groups': [u'hadoop']}\n2014-12-25 01:38:56,468 - Modifying user zookeeper\n2014-12-25 01:38:56,484
- User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25
01:38:56,485 - Modifying user falcon\n2014-12-25 01:38:56,499 - User['sqoop'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25 01:38:56,500 - Modifying user
sqoop\n2014-12-25 01:38:56,516 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False,
'groups': [u'hadoop']}\n2014-12-25 01:38:56,517 - Modifying user yarn\n2014-12-25 01:38:56,533
- User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2014-12-25
01:38:56,533 - Modifying user hcat\n2014-12-25 01:38:56,548 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}\n2014-12-25 01:38:56,562 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
{'not_if': 'test $(id -u ambari-qa) -gt 1000'}\n2014-12-25 01:38:56,570 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
due to not_if\n2014-12-25 01:38:56,571 - Directory['/grid/0/hadoop/hbase'] {'owner': 'hbase',
'recursive': True, 'recursive_permission': True, 'mode': 0775}\n2014-12-25 01:38:56,572 -
File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'),
'mode': 0555}\n2014-12-25 01:38:56,586 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase'] {'not_if':
'test $(id -u hbase) -gt 1000'}\n2014-12-25 01:38:56,592 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase'] due to not_if\n2014-12-25
01:38:56,593 - Directory['/etc/hadoop'] {'mode': 0755}\n2014-12-25 01:38:56,593 - Directory['/etc/hadoop/conf.empty']
{'owner': 'hdfs', 'group': 'hadoop', 'recursive': True}\n2014-12-25 01:38:56,594 - Link['/etc/hadoop/conf']
{'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}\n2014-12-25 01:38:56,598
- Skipping Link['/etc/hadoop/conf'] due to not_if\n2014-12-25 01:38:56,613 - File['/etc/hadoop/conf/hadoop-env.sh']
{'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}\n2014-12-25 01:38:56,639
- Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}\n2014-12-25
01:38:56,642 - Skipping Execute['/bin/echo 0 > /selinux/enforce'] due to only_if\n2014-12-25
01:38:56,643 - Directory['/grid/0/log/hadoop'] {'owner': 'root', 'group': 'hadoop', 'mode':
0775, 'recursive': True}\n2014-12-25 01:38:56,644 - Directory['/var/run/hadoop'] {'owner':
'root', 'group': 'root', 'recursive': True}\n2014-12-25 01:38:56,644 - Directory['/tmp/hadoop-hdfs']
{'owner': 'hdfs', 'recursive': True}\n2014-12-25 01:38:56,649 - File['/etc/hadoop/conf/commons-logging.properties']
{'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}\n2014-12-25 01:38:56,664
- File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner':
'hdfs'}\n2014-12-25 01:38:56,677 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...',
'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}\n2014-12-25 01:38:56,708 - File['/etc/hadoop/conf/hadoop-metrics2.properties']
{'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}\n2014-12-25 01:38:56,720
- File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'),
'mode': 0755}\n2014-12-25 01:38:56,945 - Directory['/etc/hive'] {'mode': 0755}\n2014-12-25
01:38:56,946 - Directory['/etc/hive/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'recursive':
True}\n2014-12-25 01:38:56,947 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir':
'/etc/hive/conf.server', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations':
...}\n2014-12-25 01:38:56,960 - Generating config: /etc/hive/conf.server/mapred-site.xml\n2014-12-25
01:38:56,960 - File['/etc/hive/conf.server/mapred-site.xml'] {'owner': 'hive', 'content':
InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}\n2014-12-25 01:38:57,005
- Writing File['/etc/hive/conf.server/mapred-site.xml'] because contents don't match\n2014-12-25
01:38:57,030 - File['/etc/hive/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group':
'hadoop'}\n2014-12-25 01:38:57,031 - File['/etc/hive/conf.server/hive-env.sh.template'] {'owner':
'hive', 'group': 'hadoop'}\n2014-12-25 01:38:57,032 - File['/etc/hive/conf.server/hive-exec-log4j.properties']
{'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}\n2014-12-25 01:38:57,047
- File['/etc/hive/conf.server/hive-log4j.properties'] {'content': '...', 'owner': 'hive',
'group': 'hadoop', 'mode': 0644}\n2014-12-25 01:38:57,061 - Directory['/etc/hive/conf'] {'owner':
'hive', 'group': 'hadoop', 'recursive': True}\n2014-12-25 01:38:57,062 - XmlConfig['mapred-site.xml']
{'group': 'hadoop', 'conf_dir': '/etc/hive/conf', 'mode': 0644, 'configuration_attributes':
..., 'owner': 'hive', 'configurations': ...}\n2014-12-25 01:38:57,079 - Generating config:
/etc/hive/conf/mapred-site.xml\n2014-12-25 01:38:57,079 - File['/etc/hive/conf/mapred-site.xml']
{'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding':
'UTF-8'}\n2014-12-25 01:38:57,124 - Writing File['/etc/hive/conf/mapred-site.xml'] because
contents don't match\n2014-12-25 01:38:57,147 - File['/etc/hive/conf/hive-default.xml.template']
{'owner': 'hive', 'group': 'hadoop'}\n2014-12-25 01:38:57,148 - File['/etc/hive/conf/hive-env.sh.template']
{'owner': 'hive', 'group': 'hadoop'}\n2014-12-25 01:38:57,149 - File['/etc/hive/conf/hive-exec-log4j.properties']
{'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}\n2014-12-25 01:38:57,164
- File['/etc/hive/conf/hive-log4j.properties'] {'content': '...', 'owner': 'hive', 'group':
'hadoop', 'mode': 0644}\n2014-12-25 01:38:57,178 - XmlConfig['hive-site.xml'] {'group': 'hadoop',
'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 'configuration_attributes': ..., 'owner':
'hive', 'configurations': ...}\n2014-12-25 01:38:57,193 - Generating config: /etc/hive/conf.server/hive-site.xml\n2014-12-25
01:38:57,194 - File['/etc/hive/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...),
'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}\n2014-12-25 01:38:57,306 - Writing File['/etc/hive/conf.server/hive-site.xml']
because it doesn't exist\n2014-12-25 01:38:57,326 - Changing owner for /etc/hive/conf.server/hive-site.xml
from 0 to hive\n2014-12-25 01:38:57,344 - File['/etc/hive/conf.server/hive-env.sh'] {'content':
InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}\n2014-12-25 01:38:57,345 - Writing
File['/etc/hive/conf.server/hive-env.sh'] because it doesn't exist\n2014-12-25 01:38:57,372
- Changing owner for /etc/hive/conf.server/hive-env.sh from 0 to hive\n2014-12-25 01:38:57,387
- Execute['('cp', '/usr/share/java/mysql-connector-java.jar', '/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar')']
{'creates': '/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar', 'path': ['/bin',
'/usr/bin/'], 'sudo': True, 'not_if': 'test -f /usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar'}\n2014-12-25
01:38:57,406 - Error while executing command 'start':\nTraceback (most recent call last):\n
 File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\",
line 152, in execute\n    method(env)\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py\",
line 47, in start\n    self.configure(env)  # FOR SECURITY\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py\",
line 40, in configure\n    hive(name = 'metastore')\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py\",
line 76, in hive\n    jdbc_connector()\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py\",
line 223, in jdbc_connector\n    sudo=True\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/base.py\",
line 148, in __init__\n    self.env.run()\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\",
line 149, in run\n    self.run_action(resource, action)\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\",
line 115, in run_action\n    provider_action()\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py\",
line 266, in action_run\n    raise ex\nFail: Execution of 'cp /usr/share/java/mysql-connector-java.jar
/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar' returned 1. cp: not writing
through dangling symlink `/usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar'
> {noformat}
>  *_All cluster logs are in the attachment_*
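
The root cause is visible in the traceback: the copy is guarded by 'not_if': 'test -f <destination>', but test -f dereferences symlinks, so a dangling symlink at the destination reports as absent and the copy runs anyway; GNU cp then refuses to write through the dangling link. Below is a minimal sketch of that failure mode. The temp paths and file names are illustrative stand-ins rather than the real HDP layout, and the sketch is not the committed fix.

{noformat}
import os
import subprocess
import tempfile

# Illustrative stand-ins for /usr/share/java/mysql-connector-java.jar and the
# dangling /usr/hdp/current/hive-metastore/lib/mysql-connector-java.jar link.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "mysql-connector-java.jar")
dst = os.path.join(workdir, "dangling-link.jar")

open(src, "w").close()                                        # fake "connector jar"
os.symlink(os.path.join(workdir, "missing-target.jar"), dst)  # dangling symlink

# test -f / os.path.isfile dereference the link, so a dangling link reports as
# absent, which is why a 'not_if: test -f <dst>' guard does not skip the copy:
print(os.path.isfile(dst))    # False
print(os.path.lexists(dst))   # True, the link itself still exists

# GNU cp (without POSIXLY_CORRECT) then refuses to write through the dangling link:
rc = subprocess.call(["cp", src, dst])
print(rc)                     # non-zero; stderr: "cp: not writing through dangling symlink ..."
{noformat}

One way out of that state, again only a sketch and not necessarily what AMBARI-8944.patch does, is to test for the link itself (os.path.lexists() in Python, test -L in shell) and remove a dangling destination before copying, or to re-point the symlink at the real connector jar.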



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
