hadoop-common-issues mailing list archives

From "Colin Patrick McCabe (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-10870) Failed to load OpenSSL cipher error logs on systems with old openssl versions
Date Mon, 21 Jul 2014 23:54:39 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-10870?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14069536#comment-14069536
] 

Colin Patrick McCabe commented on HADOOP-10870:
-----------------------------------------------

* Avoid spamming the console with ERROR logs when openssl can't be loaded.
* "hadoop checknative" should output something a little more descriptive when openssl can't be loaded.

Example of "hadoop checknative" output when openssl isn't new enough:
{code}
cmccabe@keter:~/hadoop> hadoop checknative
Native library checking:
hadoop:  true /h-new/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /usr/local/lib64/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /usr/lib64/libbz2.so.1
openssl: false Cannot find AES-CTR support, is your version of Openssl new enough?
{code}
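A minimal sketch of the logging change described above (class and method names here are hypothetical, not the actual HADOOP-10870 patch): record why the OpenSSL cipher failed to load instead of printing an ERROR with a full stack trace, so that a checker like "hadoop checknative" can report the reason on a single line.

```java
public class OpensslCipherSketch {
    // Holds the failure reason, or null when the native cipher loaded fine.
    private static final String loadingFailureReason;

    static {
        String reason = null;
        try {
            initNativeCipher();  // stand-in for the real native initIDs() call
        } catch (UnsatisfiedLinkError e) {
            // Keep the message (e.g. "Cannot find AES-CTR support, ...")
            // instead of spamming the console at ERROR level.
            reason = e.getMessage();
        }
        loadingFailureReason = reason;
    }

    // Simulates the native method; here it always fails, mimicking an old openssl.
    private static void initNativeCipher() {
        throw new UnsatisfiedLinkError(
            "Cannot find AES-CTR support, is your version of Openssl new enough?");
    }

    /** Null means the cipher is usable; otherwise a one-line explanation. */
    public static String getLoadingFailureReason() {
        return loadingFailureReason;
    }

    public static void main(String[] args) {
        String r = getLoadingFailureReason();
        System.out.println("openssl: " + (r == null ? "true" : "false " + r));
    }
}
```

With this pattern the failure is surfaced exactly once, at the point where a human asked for it, rather than on every client start.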

> Failed to load OpenSSL cipher error logs on systems with old openssl versions
> -----------------------------------------------------------------------------
>
>                 Key: HADOOP-10870
>                 URL: https://issues.apache.org/jira/browse/HADOOP-10870
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: security
>    Affects Versions: fs-encryption (HADOOP-10150 and HDFS-6134)
>            Reporter: Stephen Chu
>            Assignee: Colin Patrick McCabe
>         Attachments: HADOOP-10870-fs-enc.001.patch
>
>
> I built Hadoop from the fs-encryption branch and deployed Hadoop (without enabling any security confs) on a CentOS 6.4 VM with an old version of openssl.
> {code}
> [root@schu-enc hadoop-common]# rpm -qa | grep openssl
> openssl-1.0.0-27.el6_4.2.x86_64
> openssl-devel-1.0.0-27.el6_4.2.x86_64
> {code}
> When I try to do a simple "hadoop fs -ls", I get
> {code}
> [hdfs@schu-enc hadoop-common]$ hadoop fs -ls
> 2014-07-21 19:35:14,486 ERROR [main] crypto.OpensslCipher (OpensslCipher.java:<clinit>(87)) - Failed to load OpenSSL Cipher.
> java.lang.UnsatisfiedLinkError: Cannot find AES-CTR support, is your version of Openssl new enough?
> 	at org.apache.hadoop.crypto.OpensslCipher.initIDs(Native Method)
> 	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84)
> 	at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
> 	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:55)
> 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:591)
> 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:561)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
> 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2590)
> 	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
> 	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2624)
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2606)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:352)
> 	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> 	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
> 	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:228)
> 	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:211)
> 	at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:194)
> 	at org.apache.hadoop.fs.shell.Command.run(Command.java:155)
> 	at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> 	at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
> 2014-07-21 19:35:14,495 WARN  [main] crypto.CryptoCodec (CryptoCodec.java:getInstance(66)) - Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
> {code}
> It would be an improvement to clean up/shorten this error log.
> hadoop checknative shows the error as well:
> {code}
> [hdfs@schu-enc ~]$ hadoop checknative
> 2014-07-21 19:38:38,376 INFO  [main] bzip2.Bzip2Factory (Bzip2Factory.java:isNativeBzip2Loaded(70)) - Successfully loaded & initialized native-bzip2 library system-native
> 2014-07-21 19:38:38,395 INFO  [main] zlib.ZlibFactory (ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized native-zlib library
> 2014-07-21 19:38:38,411 ERROR [main] crypto.OpensslCipher (OpensslCipher.java:<clinit>(87)) - Failed to load OpenSSL Cipher.
> java.lang.UnsatisfiedLinkError: Cannot find AES-CTR support, is your version of Openssl new enough?
> 	at org.apache.hadoop.crypto.OpensslCipher.initIDs(Native Method)
> 	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84)
> 	at org.apache.hadoop.util.NativeLibraryChecker.main(NativeLibraryChecker.java:82)
> Native library checking:
> hadoop:  true /home/hdfs/hadoop-3.0.0-SNAPSHOT/lib/native/libhadoop.so.1.0.0
> zlib:    true /lib64/libz.so.1
> snappy:  true /usr/lib64/libsnappy.so.1
> lz4:     true revision:99
> bzip2:   true /lib64/libbz2.so.1
> openssl: false 
> {code}
> Thanks to cmccabe, who identified this issue as a bug.
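The WARN line in the log above ("Crypto codec ... is not available") reflects a fallback pattern: try each configured codec in turn and degrade to a pure-Java implementation when the native one cannot load. A sketch of that pattern, with hypothetical names (the real CryptoCodec.getInstance differs):

```java
import java.util.Arrays;
import java.util.List;

public class CodecFallbackSketch {
    interface CryptoCodec { String name(); }

    // Pure-Java fallback codec; always available.
    static class JceAesCtrCryptoCodec implements CryptoCodec {
        public String name() { return "JceAesCtrCryptoCodec"; }
    }

    static CryptoCodec getInstance(List<String> codecClasses) {
        for (String cls : codecClasses) {
            try {
                return (CryptoCodec) Class.forName(cls)
                        .getDeclaredConstructor().newInstance();
            } catch (Throwable t) {
                // e.g. UnsatisfiedLinkError from a native cipher, or a
                // missing class; warn once and try the next configured codec.
                System.err.println("Crypto codec " + cls + " is not available.");
            }
        }
        return new JceAesCtrCryptoCodec();
    }

    public static void main(String[] args) {
        // First entry does not exist, mimicking a failed native codec load.
        CryptoCodec codec = getInstance(
            Arrays.asList("org.example.MissingNativeCodec"));
        System.out.println(codec.name());
    }
}
```

The key point is that a missing native library costs one warning, not a failed "hadoop fs -ls".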



--
This message was sent by Atlassian JIRA
(v6.2#6252)
