ambari-issues mailing list archives

From "Robert Levas (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-17921) Spark and Spark2 should use different keytab files to avoid ACL issues
Date Wed, 27 Jul 2016 14:23:20 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-17921?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Levas updated AMBARI-17921:
----------------------------------
    Description: 
If both Spark and Spark2 are installed and each runs as a different user, then the ACLs on the _shared_ keytab files may block components in either service from reading the keytab files they need.


For example, if Spark is set to run as the user {{spark}} and Spark2 is set to run as the user {{spark2}}:
{noformat}
spark-env/spark_user = spark
spark2-env/spark_user = spark2
{noformat}

Then the keytab file for the shared headless principal, {{spark.headless.keytab}}, will have an ACL that lets either the {{spark}} user or the {{spark2}} user read it, but not both (depending on the order in which the keytab file is written).
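
A minimal sketch of how this happens (not the actual Ambari agent code; the helper below is hypothetical): each service writes the same keytab path, chowns it to its own user, and restricts the mode, so whichever service is processed last wins.
{code}
import os
import pwd

SHARED_KEYTAB = "/etc/security/keytabs/spark.headless.keytab"  # same path for both services

def install_keytab_for(username, keytab_bytes):
    # Hypothetical helper: write the keytab and restrict it to a single user.
    with open(SHARED_KEYTAB, "wb") as f:
        f.write(keytab_bytes)
    os.chown(SHARED_KEYTAB, pwd.getpwnam(username).pw_uid, -1)  # owned by this service user
    os.chmod(SHARED_KEYTAB, 0o400)                              # owner-only read

# If SPARK is processed first and SPARK2 second, the file ends up owned by
# 'spark2' with mode 0400, so the 'spark' user can no longer read it
# (or vice versa, depending on the order).
{code}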

In this case, the following error will be encountered:

{code}
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_thrift_server.py", line 87, in <module>
    SparkThriftServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_thrift_server.py", line 54, in start
    spark_service('sparkthriftserver', upgrade_type=upgrade_type, action='start')
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_service.py", line 57, in spark_service
    Execute(spark_kinit_cmd, user=params.spark_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/kinit -kt /etc/security/keytabs/spark.headless.keytab spark2rndYgi0ZFOo3FTqIlDWN5GOq@HWQE.HORTONWORKS.COM; ' returned 1. ######## Hortonworks #############
This is MOTD message, added for testing in qe infra
kinit: Generic preauthentication failure while getting initial credentials
{code}

"kinit: Generic preauthentication failure while getting initial credentials" indicates, in
this case, the the user running the Spark service does not have access to the specified keytab
file.
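
A quick way to confirm this diagnosis (illustrative, not part of Ambari) is to look at who owns the keytab and with what mode:
{code}
import grp
import os
import pwd
import stat

keytab = "/etc/security/keytabs/spark.headless.keytab"
st = os.stat(keytab)
owner = pwd.getpwuid(st.st_uid).pw_name
group = grp.getgrgid(st.st_gid).gr_name
mode = oct(stat.S_IMODE(st.st_mode))
print("%s  owner=%s  group=%s  mode=%s" % (keytab, owner, group, mode))
# e.g. owner=spark2 with mode 0400 means a kinit run as the 'spark' user cannot
# read the keytab, which surfaces as the preauthentication failure above.
{code}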

To ensure this does not happen, the keytab files for the two services should have different file names.
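
For example (the file names below are illustrative; the point is simply that the two services stop sharing one file):
{noformat}
spark-env/spark_user  = spark   ->  /etc/security/keytabs/spark.headless.keytab   (owner spark,  mode 400)
spark2-env/spark_user = spark2  ->  /etc/security/keytabs/spark2.headless.keytab  (owner spark2, mode 400)
{noformat}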


> Spark and Spark2 should use different keytab files to avoid ACL issues
> ----------------------------------------------------------------------
>
>                 Key: AMBARI-17921
>                 URL: https://issues.apache.org/jira/browse/AMBARI-17921
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.4.0
>            Reporter: Robert Levas
>            Assignee: Robert Levas
>             Fix For: 2.4.0
>
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
