From: Pratik Gadiya
To: user@hadoop.apache.org
Subject: Hive Metastore Service Startup Fails
Date: Mon, 1 Jun 2015 11:31:14 +0000

Hello All,

When I deploy a Hortonworks cluster using the Ambari blueprint APIs, the deployment sometimes fails while starting the Hive Metastore service.

The same blueprint works fine most of the time on the same environment.
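In case it is relevant, the deployment is driven by two calls against the Ambari REST API, roughly like the sketch below. This is only an illustration of my workflow: the server host, credentials, file names and cluster name are placeholders, not our real values. The host mapping file posted in the second call is the one shown further down.

deploy_cluster.py (illustrative sketch):

    # Sketch of the two blueprint-API calls; host, credentials and
    # file names are placeholders.
    import json
    import requests

    AMBARI = "http://ambari-server.example.com:8080/api/v1"
    AUTH = ("admin", "admin")                 # placeholder credentials
    HEADERS = {"X-Requested-By": "ambari"}

    def deploy(blueprint_file, host_mapping_file, blueprint_name, cluster_name):
        # 1. Register the blueprint ("onemasterblueprint" in our case).
        with open(blueprint_file) as f:
            blueprint = json.load(f)
        r = requests.post("%s/blueprints/%s" % (AMBARI, blueprint_name),
                          auth=AUTH, headers=HEADERS, data=json.dumps(blueprint))
        r.raise_for_status()

        # 2. Post the cluster creation template (the host mapping file),
        #    which carries the configurations block and default_password.
        with open(host_mapping_file) as f:
            mapping = json.load(f)
        r = requests.post("%s/clusters/%s" % (AMBARI, cluster_name),
                          auth=AUTH, headers=HEADERS, data=json.dumps(mapping))
        r.raise_for_status()
        return r.json()   # request href, used to poll deployment progress

    if __name__ == "__main__":
        deploy("blueprint.json", "hostmapping.json", "onemasterblueprint", "mycluster")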
The only parameter that changes between runs of the blueprint, with respect to Hive, is the password.

Host Mapping File Content:

{'blueprint': 'onemasterblueprint',
 'configurations': [{u'hive-env': {u'hive_metastore_user_passwd': 'tkdw1rN&'}},
                    {u'gateway-site': {u'gateway.port': u'8445'}},
                    {u'nagios-env': {u'nagios_contact': u'abc@us.ibm.com'}},
                    {u'hive-site': {u'javax.jdo.option.ConnectionPassword': 'tkdw1rN&'}},
                    {'hdfs-site': {'dfs.datanode.data.dir': '/disk1/hadoop/hdfs/data,/disk2/hadoop/hdfs/data',
                                   'dfs.namenode.checkpoint.dir': '/disk1/hadoop/hdfs/namesecondary',
                                   'dfs.namenode.name.dir': '/disk1/hadoop/hdfs/namenode'}},
                    {'core-site': {'fs.swift.impl': 'org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem',
                                   'fs.swift.service.softlayer.auth.url': 'https://dal05.objectstorage.service.networklayer.com/auth/v1.0',
                                   'fs.swift.service.softlayer.connect.timeout': '120000',
                                   'fs.swift.service.softlayer.public': 'false',
                                   'fs.swift.service.softlayer.use.encryption': 'true',
                                   'fs.swift.service.softlayer.use.get.auth': 'true'}}],
 'default_password': 'tkdw1rN&',
 'host_groups': [{'hosts': [{'fqdn': 'vmktest0003.test.analytics.com'}],
                  'name': 'master'},
                 {'hosts': [{'fqdn': 'vmktest0004.test.analytics.com'}],
                  'name': 'compute'}]}

Error.txt:

2015-06-01 05:59:22,178 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 43, in start
    self.configure(env) # FOR SECURITY
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 38, in configure
    hive(name='metastore')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive.py", line 97, in hive
    not_if = check_schema_created_cmd
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
    raise ex
Fail: Execution of 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]' returned 1. 15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
Metastore connection URL: jdbc:mysql://vmktest0009.test.analytics.ibmcloud.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
*** schemaTool failed ***
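One detail that may matter: the failing command embeds the password directly in a shell string, and our generated password contains '&' (and, in the not_if command shown in Output.txt below, a single quote). The sketch below is only my illustration of that quoting hazard; the build_cmd helper is hypothetical and not the actual Ambari hive.py code.

quoting_check.py (illustrative sketch):

    # Hypothetical illustration of the shell-quoting hazard; build_cmd
    # is my own helper, not Ambari code.
    import pipes  # Python 2.6/2.7; use shlex.quote on Python 3

    def build_cmd(password, quote=True):
        pw = pipes.quote(password) if quote else password
        return ("export HIVE_CONF_DIR=/etc/hive/conf.server ; "
                "/usr/hdp/current/hive-client/bin/schematool "
                "-initSchema -dbType mysql -userName hive -passWord " + pw)

    if __name__ == "__main__":
        # Unquoted, the '&' is a shell control operator: the command is
        # backgrounded and schematool only sees 'tkdw1rN', so the schema
        # check/init runs with the wrong password.
        print(build_cmd("tkdw1rN&", quote=False))
        # Quoted, the full password survives the shell intact.
        print(build_cmd("tkdw1rN&", quote=True))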
Output.txt:

2015-06-01 05:59:07,907 - Changing permission for /var/lib/ambari-agent/data/tmp/start_metastore_script from 644 to 755
2015-06-01 05:59:07,909 - Execute['export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]'] {'not_if': 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -info -dbType mysql -userName hive -passWord \'Hb2\'"\'"\'aasz\''}
2015-06-01 05:59:22,178 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 43, in start
    self.configure(env) # FOR SECURITY
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_metastore.py", line 38, in configure
    hive(name='metastore')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive.py", line 97, in hive
    not_if = check_schema_created_cmd
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
    raise ex
Fail: Execution of 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]' returned 1. 15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
15/06/01 05:59:21 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist
Metastore connection URL: jdbc:mysql://vmktest0009.test.analytics.ibmcloud.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
*** schemaTool failed ***

Are there any constraints on the passwords that can be set through Ambari?

Please let me know how I can resolve this error so that I can automate this deployment.

With Regards,
Pratik Gadiya
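P.S. If special characters do turn out to be the problem, this is the kind of pre-check I am considering adding to my automation before the blueprint is generated. The allowed character set here is my own guess, not a documented Ambari constraint.

password_precheck.py (illustrative sketch):

    # Illustrative only; the allowed character set is an assumption on my
    # part, not a documented Ambari rule.
    import re
    import random
    import string

    SAFE_CHARS = string.ascii_letters + string.digits
    SAFE_RE = re.compile(r'^[A-Za-z0-9]+$')

    def is_shell_safe(password):
        # True if the password contains no characters the shell treats
        # specially (&, quotes, $, ...), so it survives being embedded
        # in a command line.
        return bool(SAFE_RE.match(password))

    def generate_safe_password(length=12):
        # Generate a default_password from the safe set (random.choice is
        # not cryptographically strong; good enough for this illustration).
        return ''.join(random.choice(SAFE_CHARS) for _ in range(length))

    if __name__ == "__main__":
        print(is_shell_safe("tkdw1rN&"))                 # False -- '&' is the suspect
        print(is_shell_safe(generate_safe_password()))   # True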
