hadoop-common-user mailing list archives

From Chris Nauroth <cnaur...@hortonworks.com>
Subject Re: Running spark 1.2 on Hadoop + Kerberos
Date Fri, 09 Jan 2015 18:53:03 GMT
This kind of error can happen in an HDFS HA deployment if the client is
connecting to a NameNode that happens to be the standby at the moment
instead of the active, and the client is not configured to fail over its
NameNode connection to the other one.  This is discussed in the HA
documentation here:

http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-hdfs/HDFSHighAvailabilityWithQJM.html#Configuration_details

In particular, I recommend reading the section about
ConfiguredFailoverProxyProvider and reviewing your configuration files for
it across all nodes.  If this is configured correctly, the HDFS client
should detect that it has connected to a standby NameNode and
automatically fail over to the current active.
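
For reference, the client-side HA settings described in that documentation
typically look like the following sketch. The nameservice name "mycluster"
and the host names are placeholders; substitute your own values from the
cluster's configuration:

```xml
<!-- hdfs-site.xml: client-side HA configuration sketch.
     "mycluster", nn1/nn2, and the host names below are placeholders. -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>nn1-host.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>nn2-host.example.com:8020</value>
</property>
<!-- This is the proxy provider that performs automatic client failover. -->
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

Clients should then address the filesystem by the nameservice (e.g.
fs.defaultFS set to hdfs://mycluster in core-site.xml) rather than by a
single NameNode's host:port, so the proxy provider can pick the active one.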

Chris Nauroth
Hortonworks
http://hortonworks.com/


On Thu, Jan 8, 2015 at 8:10 PM, Manoj Samel <manojsameltech@gmail.com>
wrote:

> x-posting to Hadoop
>
> See following error. Hadoop version is 2.3 (CDH 5.0). Name node and
> Resource Manager are in HA configuration
>
> Any thoughts?
>
> Thanks,
> ---------- Forwarded message ----------
> From: Manoj Samel <manojsameltech@gmail.com>
> Date: Thu, Jan 8, 2015 at 7:33 PM
> Subject: Re: Running spark 1.2 on Hadoop + Kerberos
> To: Marcelo Vanzin <vanzin@cloudera.com>
>
>
> After doing kinit, when the job is submitted using spark-submit, it gives
> the following trace. Any idea what the issue is? HDFS is up and running.
>
> 15/01/09 03:21:54 INFO yarn.Client: Will allocate AM container, with 1408
> MB memory including 384 MB overhead
> 15/01/09 03:21:54 INFO yarn.Client: Setting up container launch context
> for our AM
> 15/01/09 03:21:54 INFO yarn.Client: Preparing resources for our AM
> container
> 15/01/09 03:21:55 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
> 112908 for xxx on xxx:8020
> Exception in thread "main"
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException):
> Operation category WRITE is not supported in state standby
> at
> org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1565)
> at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1181)
> at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:6205)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:461)
> at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:905)
> at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>
> On Thu, Jan 8, 2015 at 4:15 PM, Marcelo Vanzin <vanzin@cloudera.com>
> wrote:
>
>> On Thu, Jan 8, 2015 at 4:09 PM, Manoj Samel <manojsameltech@gmail.com>
>> wrote:
>> > Some old communication (Oct 14) says Spark is not certified with
>> Kerberos.
>> > Can someone comment on this aspect ?
>>
>> Spark standalone doesn't support kerberos. Spark running on top of
>> Yarn works fine with kerberos.
>>
>> --
>> Marcelo
>>
>
>
>

