hadoop-mapreduce-user mailing list archives

From Musty Rehmani <musty_rehm...@yahoo.com.INVALID>
Subject Re: Is it possible to turn on data node encryption without kerberos?
Date Wed, 06 Apr 2016 22:54:38 GMT

Kerberos is used to authenticate a user or service principal and grant access to the cluster. It
doesn't encrypt data blocks moving in and out of the cluster.
  On Wed, Apr 6, 2016 at 4:36 PM, Lin Zhao <lin@exabeam.com> wrote:

I've been trying to secure block data transferred by HDFS. I added the properties below to
hdfs-site.xml and core-site.xml on the data node and the name node and restarted both.
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>
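
For reference, the SecureMode guide's section on data transfer encryption also discusses block
access tokens, which the name node uses to deliver the data-encryption key to clients. A sketch
of the fuller hdfs-site.xml fragment with that setting included (whether it is required in a
cluster without Kerberos is part of my question):

<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<property>
  <!-- Block access tokens carry the data-encryption key; the SecureMode
       guide pairs this with dfs.encrypt.data.transfer. -->
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>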
When I try to put a file from the hdfs command-line shell, the operation fails with "connection
reset", and I see the following in the datanode log:

org.apache.hadoop.hdfs.server.datanode.DataNode: Failed to read expected encryption handshake
from client at /172.31.36.56:48271. Perhaps the client is running an older version of Hadoop
which does not support encryption

I am able to reproduce this on two different deployments. I was following
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/SecureMode.html#Authentication,
but did not turn on Kerberos authentication; the cluster runs with simple (no) authentication,
which otherwise works in my environment. Can this be the reason the handshake fails? Any help is
appreciated.

Thanks,
Lin Zhao
