spark-issues mailing list archives

From "MarsXu (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-4503) The history server is not compatible with HDFS HA
Date Thu, 20 Nov 2014 02:40:33 GMT

     [ https://issues.apache.org/jira/browse/SPARK-4503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

MarsXu updated SPARK-4503:
--------------------------
    Description: 
  I use an HDFS high-availability (HA) cluster to store the history server data.
  The event log can be written to HDFS, but the history server cannot be started.
  
  Error log when executing "sbin/start-history-server.sh":
....
14/11/20 10:25:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root, ); users with modify permissions: Set(root, )
14/11/20 10:25:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:187)
        at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.lang.IllegalArgumentException: java.net.UnknownHostException: appcluster
        at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
....

When I set export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://s161.zw.db.d:53310/spark_history"
in spark-env.sh (pointing at a single NameNode directly), the history server can start, but there is no high availability.
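A workaround often suggested for this kind of UnknownHostException is to make the HDFS client configuration (the hdfs-site.xml that defines the "appcluster" nameservice) visible to the history server JVM. A minimal sketch, assuming the Hadoop configuration directory is /etc/hadoop/conf (adjust to your install); this is a possible workaround, not a confirmed fix:

```shell
# spark-env.sh -- sketch, assuming /etc/hadoop/conf holds core-site.xml and
# hdfs-site.xml. Exposing the Hadoop client config to the history server JVM
# lets it resolve the logical nameservice "appcluster".
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Keep the HA logical URI; NameNode resolution then comes from hdfs-site.xml
# instead of a hard-coded single NameNode address.
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://appcluster/history_server"
```

If this works, it suggests the failure is the history server not seeing the HA client configuration, rather than HA being unsupported outright.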

  The config files are as follows:
### spark-defaults.conf ###
spark.eventLog.dir	hdfs://appcluster/history_server/
spark.yarn.historyServer.address        s161.zw.db.d:18080

### spark-env.sh ###
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://appcluster/history_server"

### core-site.xml ###
    <property>
            <name>fs.defaultFS</name>
                    <value>hdfs://appcluster</value>
    </property>

### hdfs-site.xml ###
    <property>
        <name>dfs.nameservices</name>
        <value>appcluster</value>
    </property>

    <property>
        <name>dfs.ha.namenodes.appcluster</name>
        <value>nn1,nn2</value>
    </property>

    <property>
        <name>dfs.namenode.rpc-address.appcluster.nn1</name>
        <value>s161.zw.db.d:8020</value>
    </property>

    <property>
        <name>dfs.namenode.rpc-address.appcluster.nn2</name>
        <value>s162.zw.db.d:8020</value>
    </property>

    <property>
        <name>dfs.namenode.servicerpc-address.appcluster.nn1</name>
        <value>s161.zw.db.d:53310</value>
    </property>

    <property>
        <name>dfs.namenode.servicerpc-address.appcluster.nn2</name>
        <value>s162.zw.db.d:53310</value>
    </property>
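One way to check whether the client configuration actually resolves the nameservice is to query it with the standard "hdfs getconf" tool, run as the same user and in the same environment that launches sbin/start-history-server.sh (a diagnostic sketch):

```shell
# If any of these return empty or fail, a JVM using this configuration cannot
# resolve "appcluster" and will fail with java.net.UnknownHostException.
hdfs getconf -confKey dfs.nameservices
hdfs getconf -confKey dfs.ha.namenodes.appcluster
hdfs getconf -confKey dfs.namenode.rpc-address.appcluster.nn1
hdfs getconf -confKey dfs.namenode.rpc-address.appcluster.nn2
```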


> The history server is not compatible with HDFS HA
> -------------------------------------------------
>
>                 Key: SPARK-4503
>                 URL: https://issues.apache.org/jira/browse/SPARK-4503
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.1.0
>            Reporter: MarsXu
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

