hbase-issues mailing list archives

From "Hadoop QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HBASE-19387) HBase-spark snappy.SnappyError on Arm64
Date Thu, 30 Nov 2017 08:40:00 GMT

    [ https://issues.apache.org/jira/browse/HBASE-19387?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16272359#comment-16272359 ]

Hadoop QA commented on HBASE-19387:
-----------------------------------

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue}  0m 10s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red}  0m  0s{color} | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  4m 15s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  0m 50s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} shadedjars {color} | {color:green}  9m 25s{color} | {color:green} branch has no errors when building our shaded downstream artifacts. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 10s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  4m 14s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  0m 51s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  0m 51s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m  0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} xml {color} | {color:green}  0m  1s{color} | {color:green} The patch has no ill-formed XML file. {color} |
| {color:green}+1{color} | {color:green} shadedjars {color} | {color:green}  4m 22s{color} | {color:green} patch has no errors when building our shaded downstream artifacts. {color} |
| {color:green}+1{color} | {color:green} hadoopcheck {color} | {color:green} 47m 38s{color} | {color:green} Patch does not cause any errors with Hadoop 2.6.1 2.6.2 2.6.3 2.6.4 2.6.5 2.7.1 2.7.2 2.7.3 2.7.4 or 3.0.0-alpha4. {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 11s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  4m 27s{color} | {color:green} hbase-spark in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m  8s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 67m 25s{color} | {color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=17.05.0-ce Server=17.05.0-ce Image:yetus/hbase:eee3b01 |
| JIRA Issue | HBASE-19387 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12899961/HBASE-19387.patch |
| Optional Tests |  asflicense  javac  javadoc  unit  shadedjars  hadoopcheck  xml  compile |
| uname | Linux eabbfe481e09 4.4.0-43-generic #63-Ubuntu SMP Wed Oct 12 13:48:03 UTC 2016 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-HBASE-Build/component/dev-support/hbase-personality.sh |
| git revision | master / 9434d52c19 |
| maven | version: Apache Maven 3.5.2 (138edd61fd100ec658bfa2d307c43b76940a5d7d; 2017-10-18T07:58:13Z) |
| Default Java | 1.8.0_151 |
|  Test Results | https://builds.apache.org/job/PreCommit-HBASE-Build/10139/testReport/ |
| modules | C: hbase-spark U: hbase-spark |
| Console output | https://builds.apache.org/job/PreCommit-HBASE-Build/10139/console |
| Powered by | Apache Yetus 0.6.0   http://yetus.apache.org |


This message was automatically generated.



> HBase-spark snappy.SnappyError on Arm64
> ---------------------------------------
>
>                 Key: HBASE-19387
>                 URL: https://issues.apache.org/jira/browse/HBASE-19387
>             Project: HBase
>          Issue Type: Bug
>          Components: spark, test
>    Affects Versions: 3.0.0
>            Reporter: Yuqi Gu
>            Priority: Minor
>         Attachments: HBASE-19387.patch
>
>
> When running the hbase-spark unit tests on Arm64, the following failures are reported:
>  
> {code:java}
> scalatest-maven-plugin:1.0:test (test) @ hbase-spark ---
> Discovery starting.
> Discovery completed in 2 seconds, 837 milliseconds.
> Run starting. Expected test count is: 79
> HBaseDStreamFunctionsSuite:
> Formatting using clusterid: testClusterID
> - bulkput to test HBase client *** FAILED ***
>   java.lang.reflect.InvocationTargetException:
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
>   at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
>   at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
>   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>   at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
>   ...
>   Cause: java.lang.IllegalArgumentException: org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] no native library is found for os.name=Linux and os.arch=aarch64
>   at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:156)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
>   at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
>   at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
>   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>   ...
>   Cause: org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] no native library is found for os.name=Linux and os.arch=aarch64
>   at org.xerial.snappy.SnappyLoader.findNativeLibrary(SnappyLoader.java:331)
>   at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:171)
>   at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:152)
>   at org.xerial.snappy.Snappy.<clinit>(Snappy.java:46)
>   at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:154)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>   ...
> Formatting using clusterid: testClusterID
> PartitionFilterSuite:
> *** RUN ABORTED ***
>   java.lang.reflect.InvocationTargetException:
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
>   at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
>   at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
>   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>   at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
>   ...
>   Cause: java.lang.IllegalArgumentException: java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
>   at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:156)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
>   at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
>   at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
>   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>   ...
>   Cause: java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
>   at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:154)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>   at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
>   at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
>   at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
>   at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
> {code}
> Root Cause:
> The Spark version used by hbase-spark is 1.6.0, and the snappy-java version that Spark 1.6.0 depends on does not support Arm64.
> Fix it by upgrading snappy-java to 1.1.4 in hbase-spark.
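
For illustration, the native-library loading failure above can be reproduced (and the fix verified) outside the test suite with a tiny program that forces snappy-java to load its native binary. This is a minimal sketch, not part of the attached patch; it only assumes snappy-java is on the classpath, and the class names come from the stack trace above.

{code:java}
import org.xerial.snappy.Snappy;

public class SnappyArm64Check {
    public static void main(String[] args) throws Exception {
        // Any call into org.xerial.snappy.Snappy triggers the native library load,
        // the same step that fails above with FAILED_TO_LOAD_NATIVE_LIBRARY when no
        // bundled binary matches os.name=Linux / os.arch=aarch64.
        byte[] compressed = Snappy.compress("hello hbase-spark".getBytes("UTF-8"));
        byte[] restored = Snappy.uncompress(compressed);

        System.out.println("native snappy version: " + Snappy.getNativeLibraryVersion());
        System.out.println("round trip ok: "
                + "hello hbase-spark".equals(new String(restored, "UTF-8")));
    }
}
{code}

With the snappy-java pulled in by Spark 1.6.0 this is expected to fail on aarch64 with the SnappyError shown above; after the proposed upgrade to snappy-java 1.1.4 in hbase-spark, the same calls should succeed, since per this fix 1.1.4 supports Arm64.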



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
