hbase-user mailing list archives

From Alex Baranau <alex.barano...@gmail.com>
Subject Re: hbase coprocessor unit testing
Date Mon, 16 Apr 2012 14:49:09 GMT
Here's some code that worked for me [1]. You may also find it useful to look
at the pom's dependencies [2].

Alex Baranau
------
Sematext :: http://blog.sematext.com/ :: Solr - Lucene - Hadoop - HBase

[1]

From
https://github.com/sematext/HBaseHUT/blob/CPs/src/test/java/com/sematext/hbase/hut/cp/TestHBaseHutCps.java:

  private HBaseTestingUtility testingUtility = new HBaseTestingUtility();
  private HTable hTable;

  @Before
  public void before() throws Exception {
    testingUtility.getConfiguration().setStrings(
            CoprocessorHost.USER_REGION_COPROCESSOR_CONF_KEY,
            HutReadEndpoint.class.getName());
    testingUtility.startMiniCluster();
    hTable = testingUtility.createTable(Bytes.toBytes(TABLE_NAME), SALE_CF);
  }

  @After
  public void after() throws Exception {
    hTable = null;
    testingUtility.shutdownMiniCluster();
    testingUtility = null;
  }

  [... unit-tests that make use of deployed CP ...]
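For the elided tests, a rough sketch of what a client-side call against the deployed endpoint can look like on 0.92's coprocessorExec() API. ExampleCountProtocol and its count() method are hypothetical stand-ins for your own CoprocessorProtocol interface; the real tests are at the URL above:

```java
// Sketch only: ExampleCountProtocol is a hypothetical CoprocessorProtocol
// interface -- replace it with your endpoint's own protocol class.
@Test
public void testEndpointReachable() throws Throwable {
  Map<byte[], Long> perRegion = hTable.coprocessorExec(
      ExampleCountProtocol.class,
      null, null, // null start/end keys -> run on all regions of the table
      new Batch.Call<ExampleCountProtocol, Long>() {
        @Override
        public Long call(ExampleCountProtocol endpoint) throws IOException {
          return endpoint.count(SALE_CF); // hypothetical endpoint method
        }
      });
  long total = 0;
  for (Long c : perRegion.values()) {
    total += c;
  }
  assertEquals(0, total); // table is empty right after createTable()
}
```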

[2]

Full version: https://github.com/sematext/HBaseHUT/blob/CPs/pom.xml

    <hadoop.version>1.0.0</hadoop.version>
    <hbase.version>0.92.1</hbase.version>

[...]

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>${hadoop.version}</version>
      <scope>provided</scope>
      <exclusions>
        <exclusion>
          <groupId>org.codehaus.jackson</groupId>
          <artifactId>jackson-mapper-asl</artifactId>
        </exclusion>
        <exclusion>
          <groupId>org.codehaus.jackson</groupId>
          <artifactId>jackson-core-asl</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase</artifactId>
      <version>${hbase.version}</version>
      <scope>provided</scope>
    </dependency>

    <!-- Tests dependencies -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-test</artifactId>
      <version>${hadoop.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase</artifactId>
      <version>${hbase.version}</version>
      <classifier>tests</classifier>
      <scope>test</scope>
    </dependency>
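As an aside (not shown in the code above): on 0.92 you can also attach an endpoint per-table instead of globally via the conf key, by calling HTableDescriptor.addCoprocessor() before creating the table. A rough sketch, reusing the TABLE_NAME/SALE_CF constants from [1]:

```java
// Alternative to setting USER_REGION_COPROCESSOR_CONF_KEY: register the
// coprocessor only on this table's descriptor before creating it.
HTableDescriptor desc = new HTableDescriptor(Bytes.toBytes(TABLE_NAME));
desc.addFamily(new HColumnDescriptor(SALE_CF));
desc.addCoprocessor(HutReadEndpoint.class.getName());
testingUtility.getHBaseAdmin().createTable(desc);
hTable = new HTable(testingUtility.getConfiguration(), TABLE_NAME);
```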

On Mon, Apr 16, 2012 at 9:10 AM, Marcin Cylke <mcl.hbase@touk.pl> wrote:

> Hi
>
> I'm trying to write a unit test for HBase coprocessor. However it seems
> I'm doing something horribly wrong. The code I'm using to test my
> coprocessor class is in the attachment.
>
> As you can see, I'm using HBaseTestingUtility, and running a
> mini-cluster with it. The error I keep getting is:
>
> 2012-04-12 13:00:39,924 [6,1334228432020] WARN  RecoverableZooKeeper
>      :117 - Node /hbase/root-region-server already deleted, and this is
> not a retry
> 2012-04-12 13:00:39,995 [6,1334228432020] INFO  HBaseRPC
>      :240 - Server at localhost/127.0.0.1:45664 could not be reached
> after 1 tries, giving up.
> 2012-04-12 13:00:39,995 [6,1334228432020] WARN  AssignmentManager
>      :1493 - Failed assignment of -ROOT-,,0.70236052 to
> localhost,45664,1334228432229, trying to assign elsewhere instead; retry=0
> org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed setting
> up proxy interface org.apache.hadoop.hbase.ipc.HRegionInterface to
> localhost/127.0.0.1:45664 after attempts=1
>    at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:242)
>    at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1278)
>    at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1235)
>    at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1222)
>    at
>
> org.apache.hadoop.hbase.master.ServerManager.getServerConnection(ServerManager.java:496)
>    at
>
> org.apache.hadoop.hbase.master.ServerManager.sendRegionOpen(ServerManager.java:429)
>    at
>
> org.apache.hadoop.hbase.master.AssignmentManager.assign(AssignmentManager.java:1453)
>    at
>
> org.apache.hadoop.hbase.master.AssignmentManager.assign(AssignmentManager.java:1200)
>    at
>
> org.apache.hadoop.hbase.master.AssignmentManager.assign(AssignmentManager.java:1175)
>    at
>
> org.apache.hadoop.hbase.master.AssignmentManager.assign(AssignmentManager.java:1170)
>    at
>
> org.apache.hadoop.hbase.master.AssignmentManager.assignRoot(AssignmentManager.java:1918)
>    at
> org.apache.hadoop.hbase.master.HMaster.assignRootAndMeta(HMaster.java:557)
>    at
>
> org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:491)
>    at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:326)
>    at java.lang.Thread.run(Thread.java:662)
> Caused by: java.net.ConnectException: Connection refused
>    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>    at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
>    at
>
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:656)
>    at
>
> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.setupConnection(HBaseClient.java:328)
>    at
>
> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.setupIOstreams(HBaseClient.java:362)
>    at
>
> org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1026)
>    at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:878)
>    at
>
> org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
>    at $Proxy22.getProtocolVersion(Unknown Source)
>    at
>
> org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:183)
>    at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:303)
>    at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:280)
>    at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:332)
>    at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:236)
>    ... 14 more
> 2012-04-12 13:00:39,998 [6,1334228432020] WARN  AssignmentManager
>      :1504 - Unable to find a viable location to assign region
> -ROOT-,,0.70236052
> 2012-04-12 13:00:44,138 [.timeoutMonitor] INFO  AssignmentManager
>      :2570 - Regions in transition timed out:  -ROOT-,,0.70236052
> state=OFFLINE, ts=1334228439998, server=null
> 2012-04-12 13:00:44,141 [.timeoutMonitor] INFO  AssignmentManager
>      :2581 - Region has been OFFLINE for too long, reassigning
> -ROOT-,,0.70236052 to a random server
> 2012-04-12 13:00:44,158 [pool-6-thread-1] INFO  HBaseRPC
>      :240 - Server at localhost/127.0.0.1:45664 could not be reached
> after 1 tries, giving up.
>
> This may be related to me using the initHRegion() function - perhaps
> that region cannot connect to the newly created HBase cluster?
>
>
