hbase-dev mailing list archives
From Stack <st...@duboce.net>
Subject Re: Error : java.lang.UnsatisfiedLinkError: failed to load the required native library for netty
Date Wed, 04 Oct 2017 19:59:46 GMT
Seems like eclipse runs unit tests serially in the same JVM [1]. Do you have
to set it in your eclipse.ini file as an option? [2] (That'd be a pain.)
St.Ack


1.
https://stackoverflow.com/questions/12933565/junit-fork-mode-in-java-classes
2. http://wiki.eclipse.org/Eclipse.ini
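[Editor's note] The fix discussed in this thread is to pass the shaded-netty packagePrefix system property to the test JVM: `-Dorg.apache.hadoop.hbase.shaded.io.netty.packagePrefix=org.apache.hadoop.hbase.shaded.` (trailing period included). A minimal sketch of why the property matters, assuming netty-style native-library loading (this is an illustration, not the actual netty NativeLibraryLoader source; `nativeLibName` is a made-up helper): the loader prepends the prefix, with dots mapped to underscores, to the base native library name, so the renamed .so inside the shaded jar can be located.

```java
// Illustration only (assumed behavior, not HBase/netty source): how a
// shaded-netty-style loader could derive the native library name from
// the packagePrefix system property.
public class ShadedLibName {
    // Hypothetical helper: dots from the package prefix are not legal in
    // a library name, so they are mapped to underscores before prepending.
    static String nativeLibName(String baseName) {
        String prefix = System.getProperty(
            "org.apache.hadoop.hbase.shaded.io.netty.packagePrefix", "");
        return prefix.replace('.', '_') + baseName;
    }

    public static void main(String[] args) {
        System.setProperty(
            "org.apache.hadoop.hbase.shaded.io.netty.packagePrefix",
            "org.apache.hadoop.hbase.shaded.");
        // Without the property, a loader following this scheme would look
        // for the unshaded name and fail with UnsatisfiedLinkError, as
        // reported in this thread.
        System.out.println(nativeLibName("netty_transport_native_epoll"));
        // -> org_apache_hadoop_hbase_shaded_netty_transport_native_epoll
    }
}
```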

On Wed, Oct 4, 2017 at 9:31 AM, ramkrishna vasudevan <
ramkrishna.s.vasudevan@gmail.com> wrote:

> I too tried just now but it does not seem to work.
>
> >>Sorry for the irritation.
> No problem Stack.
>
> Regards
> Ram
>
> On Wed, Oct 4, 2017 at 9:58 PM, Amit Kabra <amitkabraiiit@gmail.com>
> wrote:
>
> > I tried that, but it didn't work for me.
> >
> > ramkrishna.s.vasudevan@gmail.com - can you try and see if that works for
> > you?
> >
> >
> > Amit.
> >
> > On Wed, Oct 4, 2017 at 8:34 PM, Stack <stack@duboce.net> wrote:
> >
> > > On Tue, Oct 3, 2017 at 10:18 PM, ramkrishna vasudevan <
> > > ramkrishna.s.vasudevan@gmail.com> wrote:
> > >
> > > > Hi Stack
> > > > I just took an update and tried running some test cases with the
> > > > eclipse IDE working on linux.
> > > >
> > > >
> > > If you bring up the test run configuration window in eclipse and add in
> > > the command-line argument panel
> > >
> > > -Dorg.apache.hadoop.hbase.shaded.io.netty.packagePrefix=org.apache.hadoop.hbase.shaded.
> > >
> > > ... does it pass?
> > >
> > > If so, I'll figure how to get it into eclipse context.
> > >
> > > Sorry for the irritation.
> > >
> > > St.Ack
> > >
> > >
> > >
> > >
> > >
> > >
> > > > java.io.IOException: Shutting down
> > > >   at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:232)
> > > >   at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
> > > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1128)
> > > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1082)
> > > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:953)
> > > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:935)
> > > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:917)
> > > >   at org.apache.hadoop.hbase.regionserver.TestRegionReplicasWithModifyTable.before(TestRegionReplicasWithModifyTable.java:65)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >   ....
> > > >
> > > > Caused by: java.lang.UnsatisfiedLinkError: failed to load the required native library
> > > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Epoll.ensureAvailability(Epoll.java:78)
> > > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.EpollEventLoopGroup.<clinit>(EpollEventLoopGroup.java:38)
> > > >   at org.apache.hadoop.hbase.util.NettyEventLoopGroupConfig.<init>(NettyEventLoopGroupConfig.java:61)
> > > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
> > > >   at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:475)
> > > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > >   at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> > > >   at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:140)
> > > >   ... 26 more
> > > > Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin()I
> > > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin(Native Method)
> > > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Native.<clinit>(Native.java:66)
> > > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Epoll.<clinit>(Epoll.java:33)
> > > >   ... 35 more
> > > >
> > > > This also appears to come out of netty.
> > > > When I run the same test case using the 'mvn test -Dtest' command, it
> > > > works fine.
> > > >
> > > > Regards
> > > > Ram
> > > >
> > > > On Wed, Oct 4, 2017 at 12:42 AM, Stack <stack@duboce.net> wrote:
> > > >
> > > > > It just came in... Looks better on jenkins than my linux machine (4m
> > > > > total):
> > > > > https://issues.apache.org/jira/browse/HBASE-18606?focusedCommentId=16190154&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16190154
> > > > >
> > > > > Let me try pushing it.
> > > > >
> > > > > St.Ack
> > > > >
> > > > >
> > > > >
> > > > > On Tue, Oct 3, 2017 at 12:00 PM, Stack <stack@duboce.net> wrote:
> > > > >
> > > > > > Yeah (12x the processors, 8x RAM, 12x the disks...). Let's see how
> > > > > > the patch does on jenkins... (HBASE-18606).
> > > > > > S
> > > > > >
> > > > > > On Tue, Oct 3, 2017 at 11:48 AM, Sean Busbey <busbey@apache.org> wrote:
> > > > > >
> > > > > >> Is the linux box roughly as capable as your laptop?
> > > > > >>
> > > > > >> > On Tue, Oct 3, 2017 at 12:51 PM, Stack <stack@duboce.net> wrote:
> > > > > >> > Tests passed eventually for me:
> > > > > >> >
> > > > > >> > real 47m6.498s
> > > > > >> > user 5m29.671s
> > > > > >> > sys 0m41.885s
> > > > > >> >
> > > > > >> > ... which is a big diff from macosx run.
> > > > > >> >
> > > > > >> > Need to look into this.
> > > > > >> >
> > > > > >> > St.Ack
> > > > > >> >
> > > > > >> >
> > > > > >> > On Tue, Oct 3, 2017 at 10:03 AM, Stack <stack@duboce.net> wrote:
> > > > > >> >
> > > > > >> >> The below gets us further but now I see that the spark tests
> > > > > >> >> take a really long time to run on linux but complete promptly
> > > > > >> >> on macosx (2m 55s). Looking....
> > > > > >> >>
> > > > > >> >> St.Ack
> > > > > >> >>
> > > > > >> >> On Tue, Oct 3, 2017 at 9:13 AM, Stack <stack@duboce.net> wrote:
> > > > > >> >>
> > > > > >> >>> This seems to work for me. Does it work for you?
> > > > > >> >>>
> > > > > >> >>>
> > > > > >> >>> diff --git a/hbase-spark/pom.xml b/hbase-spark/pom.xml
> > > > > >> >>> index 594aa2a..6d191e3 100644
> > > > > >> >>> --- a/hbase-spark/pom.xml
> > > > > >> >>> +++ b/hbase-spark/pom.xml
> > > > > >> >>> @@ -568,6 +568,9 @@
> > > > > >> >>>            <junitxml>.</junitxml>
> > > > > >> >>>            <filereports>WDF TestSuite.txt</filereports>
> > > > > >> >>>            <parallel>false</parallel>
> > > > > >> >>> +          <systemProperties>
> > > > > >> >>> +            <org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>org.apache.hadoop.hbase.shaded.</org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>
> > > > > >> >>> +          </systemProperties>
> > > > > >> >>>          </configuration>
> > > > > >> >>>          <executions>
> > > > > >> >>>            <execution>
> > > > > >> >>>
> > > > > >> >>> St.Ack
> > > > > >> >>>
> > > > > >> >>> On Tue, Oct 3, 2017 at 8:45 AM, Amit Kabra <amitkabraiiit@gmail.com> wrote:
> > > > > >> >>>
> > > > > >> >>>> Thanks Stack / Sean Busbey for replying.
> > > > > >> >>>>
> > > > > >> >>>> OS : Ubuntu 16.04.2 , 64 bit.
> > > > > >> >>>> Eclipse : Version: Neon.3 Release (4.6.3)
> > > > > >> >>>> HBase branch : branch-2
> > > > > >> >>>> Command line test to reproduce: mvn clean package
> > > > > >> >>>> -Dtest=TestIncrementalBackup
> > > > > >> >>>> Reproduce from eclipse: right-click on TestIncBackupRestore
> > > > > >> >>>> and run as JUnit from test class TestIncrementalBackup.
> > > > > >> >>>> No, I am not embedding hbase in my application. I have just
> > > > > >> >>>> checked out hbase, switched to branch-2 and run the unit test
> > > > > >> >>>> from the command line or from eclipse. It fails with the same
> > > > > >> >>>> error in both cases.
> > > > > >> >>>> Yes, the trailing period is also present.
> > > > > >> >>>>
> > > > > >> >>>> Thanks,
> > > > > >> >>>> Amit Kabra.
> > > > > >> >>>>
> > > > > >> >>>>
> > > > > >> >>>>
> > > > > >> >>>>
> > > > > >> >>>>
> > > > > >> >>>> On Tue, Oct 3, 2017 at 8:53 PM, Stack <stack@duboce.net> wrote:
> > > > > >> >>>>
> > > > > >> >>>> > Thank you for the detail.
> > > > > >> >>>> >
> > > > > >> >>>> > Pardon the questions below asking for yet more detail. I am
> > > > > >> >>>> > unable to reproduce locally or on another OS (though we see
> > > > > >> >>>> > this issue up on our build box).
> > > > > >> >>>> >
> > > > > >> >>>> > What is your OS when you see the below?
> > > > > >> >>>> >
> > > > > >> >>>> > On Tue, Oct 3, 2017 at 2:06 AM, Amit Kabra <amitkabraiiit@gmail.com> wrote:
> > > > > >> >>>> >
> > > > > >> >>>> > > Hello,
> > > > > >> >>>> > >
> > > > > >> >>>> > > I am using the "branch-2" branch of hbase; when I run a
> > > > > >> >>>> > > unit test I get the following error for netty:
> > > > > >> >>>> > > "java.lang.UnsatisfiedLinkError: failed to load the
> > > > > >> >>>> > > required native library"
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > > This is running a unit test in your eclipse environment?
> > > > > >> >>>> >
> > > > > >> >>>> > You are trying to run an hbase-spark unit test when you see the above?
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> > > *I already have the following set in
> > > > > >> >>>> > > "maven-surefire-plugin" in pom.xml as per
> > > > > >> >>>> > > http://hbase.apache.org/book.html#thirdparty*
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> >
> > > > > >> >>>> > Are you embedding hbase into your application?
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> > >             <systemPropertyVariables>
> > > > > >> >>>> > >                 <!--
> > > > > >> >>>> > >               <test.build.classes>${test.build.classes}</test.build.classes>
> > > > > >> >>>> > >                 -->
> > > > > >> >>>> > >               <!--For shaded netty, to find the relocated .so.
> > > > > >> >>>> > >                    Trick from
> > > > > >> >>>> > >                 https://stackoverflow.com/questions/33825743/rename-files-inside-a-jar-using-some-maven-plugin
> > > > > >> >>>> > >
> > > > > >> >>>> > >                 The netty jar has a .so in it. Shading requires rename of
> > > > > >> >>>> > >                 the .so and then passing a system property so netty finds
> > > > > >> >>>> > >                 the renamed .so and associates it w/ the relocated netty files.
> > > > > >> >>>> > >
> > > > > >> >>>> > >                 The relocated netty is in the hbase-thirdparty dependency.
> > > > > >> >>>> > >                 Just set this property globally rather than per module.
> > > > > >> >>>> > >                -->
> > > > > >> >>>> > >               <org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>org.apache.hadoop.hbase.shaded.</org.apache.hadoop.hbase.shaded.io.netty.packagePrefix>
> > > > > >> >>>> > >             </systemPropertyVariables>
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > > *And I see in the code as per HBASE-18271, all io.netty is
> > > > > >> >>>> > > already replaced with
> > > > > >> >>>> > > org.apache.hadoop.hbase.shaded.io.netty*
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > The trailing period is also present?
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> > >
> > > > > >> >>>> > > If I run a test from eclipse, I see the error immediately
> > > > > >> >>>> > > and my test doesn't run; but when I run from the command
> > > > > >> >>>> > > line, the test runs and I get the error at the end when
> > > > > >> >>>> > > the mvn command finishes.
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > > Does this happen with any eclipse test?
> > > > > >> >>>> >
> > > > > >> >>>> > Thank you. Let me try and fix this this morning.
> > > > > >> >>>> >
> > > > > >> >>>> > S
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> >
> > > > > >> >>>> > > *Here is the complete error output.*
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > >
> > > > > >> >>>> > > [INFO]
> > > > > >> >>>> > > [INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ hbase-spark ---
> > > > > >> >>>> > > [INFO]
> > > > > >> >>>> > > [INFO] --- scalatest-maven-plugin:1.0:test (test) @ hbase-spark ---
> > > > > >> >>>> > > Discovery starting.
> > > > > >> >>>> > > Discovery completed in 1 second, 558 milliseconds.
> > > > > >> >>>> > > Run starting. Expected test count is: 79
> > > > > >> >>>> > > HBaseDStreamFunctionsSuite:
> > > > > >> >>>> > > Formatting using clusterid: testClusterID
> > > > > >> >>>> > > *** RUN ABORTED ***
> > > > > >> >>>> > >   java.io.IOException: Shutting down
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:232)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1124)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1078)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:949)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:943)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:872)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:41)
> > > > > >> >>>> > >   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.spark.HBaseDStreamFunctionsSuite.beforeAll(HBaseDStreamFunctionsSuite.scala:30)
> > > > > >> >>>> > >   ...
> > > > > >> >>>> > >   Cause: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterorg.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin()I
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:145)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:217)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:152)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:214)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:94)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1124)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1078)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:949)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:943)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:872)
> > > > > >> >>>> > >   ...
> > > > > >> >>>> > >   Cause: java.lang.UnsatisfiedLinkError: failed to load the required native library
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Epoll.ensureAvailability(Epoll.java:78)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.EpollEventLoopGroup.<clinit>(EpollEventLoopGroup.java:38)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.util.NettyEventLoopGroupConfig.<init>(NettyEventLoopGroupConfig.java:61)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:475)
> > > > > >> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > > > >> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > > > >> >>>> > >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > > > >> >>>> > >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:140)
> > > > > >> >>>> > >   ...
> > > > > >> >>>> > >   Cause: java.lang.UnsatisfiedLinkError: org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin()I
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.NativeStaticallyReferencedJniMethods.epollin(Native Method)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Native.<clinit>(Native.java:66)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.Epoll.<clinit>(Epoll.java:33)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.shaded.io.netty.channel.epoll.EpollEventLoopGroup.<clinit>(EpollEventLoopGroup.java:38)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.util.NettyEventLoopGroupConfig.<init>(NettyEventLoopGroupConfig.java:61)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:552)
> > > > > >> >>>> > >   at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:475)
> > > > > >> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > > > >> >>>> > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > > > >> >>>> > >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > > > >> >>>> > >   ...
> > > > > >> >>>> > > [INFO] ------------------------------------------------------------------------
> > > > > >> >>>> > > [INFO] Reactor Summary:
> > > > > >> >>>> > > [INFO]
> > > > > >> >>>> > > [INFO] Apache HBase ....................................... SUCCESS [  1.575 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Checkstyle .......................... SUCCESS [  0.317 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Annotations ......................... SUCCESS [  0.537 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Build Configuration ................. SUCCESS [  0.053 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Shaded Protocol ..................... SUCCESS [ 15.410 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Common .............................. SUCCESS [  4.603 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Metrics API ......................... SUCCESS [  1.213 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Hadoop Compatibility ................ SUCCESS [  0.985 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Metrics Implementation .............. SUCCESS [  0.863 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Hadoop Two Compatibility ............ SUCCESS [  1.750 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Protocol ............................ SUCCESS [  4.880 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Client .............................. SUCCESS [  5.233 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Replication ......................... SUCCESS [  1.040 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Prefix Tree ......................... SUCCESS [  1.121 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Procedure ........................... SUCCESS [  1.084 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Resource Bundle ..................... SUCCESS [  0.092 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Server .............................. SUCCESS [ 19.849 s]
> > > > > >> >>>> > > [INFO] Apache HBase - MapReduce ........................... SUCCESS [  4.221 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Testing Util ........................ SUCCESS [  3.273 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Thrift .............................. SUCCESS [  5.519 s]
> > > > > >> >>>> > > [INFO] Apache HBase - RSGroup ............................. SUCCESS [  3.408 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Shell ............................... SUCCESS [  3.859 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Coprocessor Endpoint ................ SUCCESS [  4.038 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Backup .............................. SUCCESS [01:13 min]
> > > > > >> >>>> > > [INFO] Apache HBase - Integration Tests ................... SUCCESS [  4.229 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Examples ............................ SUCCESS [  3.471 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Rest ................................ SUCCESS [  4.448 s]
> > > > > >> >>>> > > [INFO] Apache HBase - External Block Cache ................ SUCCESS [  2.040 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Spark ............................... FAILURE [ 32.833 s]
> > > > > >> >>>> > > [INFO] Apache HBase - Spark Integration Tests ............. SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Assembly ............................ SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Shaded .............................. SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Shaded - Client ..................... SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Shaded - MapReduce .................. SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase Shaded Packaging Invariants ........... SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Archetypes .......................... SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Exemplar for hbase-client archetype . SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Exemplar for hbase-shaded-client archetype SKIPPED
> > > > > >> >>>> > > [INFO] Apache HBase - Archetype builder ................... SKIPPED
> > > > > >> >>>> > > [INFO] ------------------------------------------------------------------------
> > > > > >> >>>> > > [INFO] BUILD FAILURE
> > > > > >> >>>> > > [INFO] ------------------------------------------------------------------------
> > > > > >> >>>> > > [INFO] Total time: 03:26 min
> > > > > >> >>>> > > [INFO] Finished at: 2017-09-27T19:34:35+05:30
> > > > > >> >>>> > > [INFO] Final Memory: 345M/6055M
> > > > > >> >>>> > > [INFO] ------------------------------------------------------------------------
> > > > > >> >>>> > > [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project hbase-spark: There are test failures -> [Help 1]
> > > > > >> >>>> > > [ERROR]
> > > > > >> >>>> > > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> > > > > >> >>>> > > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > > > > >> >>>> > > [ERROR]
> > > > > >> >>>> > > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> > > > > >> >>>> > > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> > > > > >> >>>> > > [ERROR]
> > > > > >> >>>> > > [ERROR] After correcting the problems, you can resume the build with the command
> > > > > >> >>>> > > [ERROR]   mvn <goals> -rf :hbase-spark
> > > > > >> >>>> > >
> > > > > >> >>>> >
> > > > > >> >>>>
> > > > > >> >>>
> > > > > >> >>>
> > > > > >> >>
> > > > > >>
> > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
