ambari-dev mailing list archives

From "Sebastian Toader (JIRA)" <j...@apache.org>
Subject [jira] [Created] (AMBARI-14232) Kerberization fails if Hive is not installed but Spark is installed without Spark TS
Date Sun, 06 Dec 2015 06:56:11 GMT
Sebastian Toader created AMBARI-14232:
-----------------------------------------

             Summary: Kerberization fails if Hive is not installed but Spark is installed without Spark TS
                 Key: AMBARI-14232
                 URL: https://issues.apache.org/jira/browse/AMBARI-14232
             Project: Ambari
          Issue Type: Bug
          Components: ambari-server
            Reporter: Sebastian Toader
            Assignee: Sebastian Toader
            Priority: Blocker
             Fix For: 2.2.0


Install Spark without Hive. The Spark client installation fails with the error below.
{code:title="/var/lib/ambari-agent/data/errors-85.txt"}
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 59, in <module>
    SparkClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 217, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 35, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 41, in configure
    setup_spark(env, 'client', action = 'config')
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/setup_spark.py", line 89, in setup_spark
    key_value_delimiter = " ",
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/properties_file.py", line 55, in action_create
    mode = self.resource.mode
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 108, in action_create
    self.resource.group, mode=self.resource.mode, cd_access=self.resource.cd_access)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 44, in _ensure_metadata
    _user_entity = pwd.getpwnam(user)
KeyError: 'getpwnam(): name not found: hive'
{code}
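
The traceback shows that the resource created in setup_spark.py ends up with an owner of 'hive', which does not exist as a local account when Hive is not installed, so pwd.getpwnam() raises KeyError. Below is a minimal sketch of the kind of guard that avoids the failure; the names (is_hive_installed, hive_user, spark_user, choose_owner) are hypothetical and are not taken from the actual Ambari params module.
{code:title=sketch: guard the owner lookup (not actual Ambari code)}
import pwd

def choose_owner(is_hive_installed, hive_user='hive', spark_user='spark'):
    """Pick a file owner that is guaranteed to exist on this host.

    Only fall back to the 'hive' account when Hive is actually part of the
    cluster; otherwise use the Spark service account.
    """
    owner = hive_user if is_hive_installed else spark_user
    # pwd.getpwnam() raises KeyError (as in the traceback above) when the
    # account is missing, so fail early with a clearer message.
    try:
        pwd.getpwnam(owner)
    except KeyError:
        raise Exception("Configured owner '%s' does not exist on this host" % owner)
    return owner
{code}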
{code:title=ambari hash}
root@os-u14-qihles-spark-re1-2:~# ambari-server --hash
863d787b3c2f06b9593aa3cca6656f9ee666817d
===========
Version 2.1.3.0 
{code}

The cluster is kept alive for debugging.
{code}
172.22.77.226
172.22.77.228
172.22.77.227
***************** INTERNAL HOSTNAMES *******************
os-u14-qihles-spark-re1-2.novalocal
os-u14-qihles-spark-re1-1.novalocal
os-u14-qihles-spark-re1-3.novalocal
{code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
