hbase-user mailing list archives

From Stack <st...@duboce.net>
Subject Re: Accessing HBase using java native api from weblogic
Date Fri, 19 Jul 2013 19:18:38 GMT
Do you see anything in the zk logs at the time of your connection loss?
Is 192.168.56.101:2181 where a zk ensemble member resides (and is it up and
running)?

St.Ack
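
Both checks above can be scripted: a healthy ZooKeeper server answers the four-letter-word command `ruok` with `imok` on its client port, so a plain socket probe tells you whether an ensemble member is reachable from the client host. Below is a minimal sketch; since no live ensemble is assumed here, a local stub thread (hypothetical test scaffolding, not part of ZooKeeper) stands in for the server. Against the real deployment you would probe 192.168.56.101:2181.

```python
import socket
import threading

def four_letter_word(host, port, cmd=b"ruok", timeout=5.0):
    """Send a ZooKeeper four-letter-word command and return the raw reply.

    A healthy ZooKeeper server answers b"ruok" with b"imok" and then
    closes the connection.
    """
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(cmd)
        s.shutdown(socket.SHUT_WR)   # we are done sending; read until EOF
        chunks = []
        while True:
            data = s.recv(1024)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

def _stub_zookeeper(server_sock):
    # Stand-in for a live ensemble member: read the command, answer "imok".
    conn, _ = server_sock.accept()
    conn.recv(4)
    conn.sendall(b"imok")
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=_stub_zookeeper, args=(server,), daemon=True).start()

reply = four_letter_word("127.0.0.1", server.getsockname()[1])
print(reply)  # b'imok'
server.close()
```

If the probe times out or the connection is refused from the WebLogic host but succeeds from the machine where the standalone run works, the problem is network-level (firewall, binding, routing) rather than in the HBase client itself.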


On Fri, Jul 19, 2013 at 11:33 AM, rjoshi <rinku24@yahoo.com> wrote:

> Hello,
>
>   I have a Java client that uses the HBase native API, and it works fine
> as a standalone program. But when I deploy the same jar file as part of a
> JAX-RS service, it is not able to reach ZooKeeper.
>
> I have included below the logs from both cases: accessed from WebLogic
> (doesn't work) and run as a standalone Java program (works fine), using
> the same jar.
>
> Any help would be appreciated.
>
>
> Below is the log when accessed from within WebLogic 10.3.5; it doesn't
> work.
>
> <pre>
> 2013-07-19 14:15:30 INFO  class:68 -
> *************************************************
> 2013-07-19 14:15:30 INFO  class:69 - HBase configuration info ::
> zookeeperQuorum -->192.168.56.101, zookeeperClientPort -->2181, hbaseMaster
> -->192.168.56.101:60000
> 2013-07-19 14:15:30 INFO  class:72 -
> *************************************************
> 2013-07-19 14:15:30 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
> with
> annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
> about=, value=[Rate of successful kerberos logins and latency
> (milliseconds)], type=DEFAULT, always=false, sampleName=Ops)
> 2013-07-19 14:15:30 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
> with
> annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
> about=, value=[Rate of failed kerberos logins and latency (milliseconds)],
> type=DEFAULT, always=false, sampleName=Ops)
> 2013-07-19 14:15:30 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and
> group
> related metrics
> 2013-07-19 14:15:30 DEBUG Groups:180 -  Creating new Groups object
> 2013-07-19 14:15:30 DEBUG NativeCodeLoader:46 - Trying to load the
> custom-built native-hadoop library...
> 2013-07-19 14:15:30 DEBUG NativeCodeLoader:55 - Failed to load
> native-hadoop
> with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
> 2013-07-19 14:15:30 DEBUG NativeCodeLoader:56 -
>
> java.library.path=C:\Java\JROCKI~1.0\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\Oracle\MIDDLE~3\patch_wls1035\profiles\default\native;C:\Oracle\MIDDLE~3\patch_ocp360\profiles\default\native;C:\Oracle\MIDDLE~3\patch_jdev1111\profiles\default\native;C:\Oracle\MIDDLE~3\WLSERV~1.3\server\native\win\32;C:\Oracle\MIDDLE~3\WLSERV~1.3\server\bin;C:\Oracle\MIDDLE~3\modules\ORGAPA~1.1\bin;C:\Java\JROCKI~1.0\jre\bin;C:\Java\JROCKI~1.0\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program
> Files (x86)\Microsoft Application Virtualization
> Client;C:\Java\jrockit-jdk1.6.0_37-R28.2.5-4.1.0;C:\Program
>
> Files\TortoiseSVN\bin;C:\Oracl;C:\Users\user123\AppData\Local\Enthought\Canopy32\User\Scripts;C:\Softwares\npp.6.3.2.bin\unicode\notepad++.exe;C:\Python\Python27;C:\Softwares\apache-maven-3.0.5\bin;C:\Softwares\depot_tools;C:\Softwares\apache-ant-1.9.1\bin;;C:\Oracle\MIDDLE~3\WLSERV~1.3\server\native\win\32\oci920_8;.
> 2013-07-19 14:15:30 WARN  NativeCodeLoader:62 - Unable to load
> native-hadoop
> library for your platform... using builtin-java classes where applicable
> 2013-07-19 14:15:30 DEBUG JniBasedUnixGroupsMappingWithFallback:40 -
> Falling
> back to shell based
> 2013-07-19 14:15:30 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group
> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
> 2013-07-19 14:15:30 DEBUG Groups:66 - Group mapping
> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
> cacheTimeout=300000
> 2013-07-19 14:15:30 DEBUG UserGroupInformation:175 - hadoop login
> 2013-07-19 14:15:30 DEBUG UserGroupInformation:124 - hadoop login commit
> 2013-07-19 14:15:30 DEBUG UserGroupInformation:154 - using local
> user:NTUserPrincipal: user123
> 2013-07-19 14:15:30 DEBUG UserGroupInformation:697 - UGI loginUser:user123
> (auth:SIMPLE)
> 2013-07-19 14:15:30 DEBUG ZKUtil:120 - hconnection opening connection to
> ZooKeeper with ensemble (192.168.56.101:2181)
> 2013-07-19 14:15:30 INFO  RecoverableZooKeeper:104 - The identifier of this
> process is 55380@WIN75CB245190F
> 2013-07-19 14:15:30 WARN  RecoverableZooKeeper:219 - Possibly transient
> ZooKeeper exception:
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
> 2013-07-19 14:15:30 INFO  RetryCounter:53 - Sleeping 2000ms before retry
> #1...
> 2013-07-19 14:15:32 WARN  RecoverableZooKeeper:219 - Possibly transient
> ZooKeeper exception:
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
> 2013-07-19 14:15:32 INFO  RetryCounter:53 - Sleeping 4000ms before retry
> #2...
> 2013-07-19 14:15:37 WARN  RecoverableZooKeeper:219 - Possibly transient
> ZooKeeper exception:
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
> 2013-07-19 14:15:37 INFO  RetryCounter:53 - Sleeping 8000ms before retry
> #3...
> 2013-07-19 14:15:45 WARN  RecoverableZooKeeper:219 - Possibly transient
> ZooKeeper exception:
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
> 2013-07-19 14:15:45 ERROR RecoverableZooKeeper:221 - ZooKeeper exists
> failed
> after 3 retries
> 2013-07-19 14:15:45 WARN  ZKUtil:453 - hconnection Unable to set watcher on
> znode (/hbase/hbaseid)
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
>         at
> org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
>         at
> org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
>         at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
>         at
>
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:172)
>         at
> org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:450)
>         at
>
> org.apache.hadoop.hbase.zookeeper.ClusterId.readClusterIdZNode(ClusterId.java:61)
>         at
> org.apache.hadoop.hbase.zookeeper.ClusterId.getId(ClusterId.java:50)
>         at
> org.apache.hadoop.hbase.zookeeper.ClusterId.hasId(ClusterId.java:44)
>         at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.ensureZookeeperTrackers(HConnectionManager.java:615)
>         at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:871)
>         at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:846)
>         at
> org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:271)
>         at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:211)
>         at
>
> org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
>         at
> org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:265)
>         at
>
> org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:195)
>         at
> org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:174)
>         at
>
> com.cap1.hbase.lookup.HBaseProcessImpl.processTrnx(HBaseProcessImpl.java:87)
>         at
>
> com.cap1.hbase.lookup.HBaseProcessImpl.processRecord(HBaseProcessImpl.java:64)
>         at
> com.cap1.hbase.callable.HBaseCallable.processTrxns(HBaseCallable.java:47)
>         at
> com.cap1.hbase.callable.HBaseCallable.call(HBaseCallable.java:38)
>         at com.cap1.hbase.callable.HBaseCallable.call(HBaseCallable.java:1)
>         at
> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>         at
> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at
>
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at
>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> </pre>
>
>
> Below is the log when the same jar file is run as a standalone Java
> program.
>
> <pre>
> 2013-07-19 14:22:28 INFO  class:68 -
> *************************************************
> 2013-07-19 14:22:28 INFO  class:69 - HBase configuration info ::
> zookeeperQuorum -->192.168.56.101, zookeeperClientPort -->2181, hbaseMaster
> -->192.168.56.101:60000
> 2013-07-19 14:22:28 INFO  class:72 -
> *************************************************
> 2013-07-19 14:22:29 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
> with
> annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
> about=, value=[Rate of successful kerberos logins and latency
> (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
> 2013-07-19 14:22:29 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
> with
> annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
> about=, value=[Rate of failed kerberos logins and latency (milliseconds)],
> always=false, type=DEFAULT, sampleName=Ops)
> 2013-07-19 14:22:29 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and
> group
> related metrics
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further
> details.
> 2013-07-19 14:22:29 DEBUG Groups:180 -  Creating new Groups object
> 2013-07-19 14:22:29 DEBUG NativeCodeLoader:46 - Trying to load the
> custom-built native-hadoop library...
> 2013-07-19 14:22:29 DEBUG NativeCodeLoader:55 - Failed to load
> native-hadoop
> with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
> 2013-07-19 14:22:29 DEBUG NativeCodeLoader:56 -
>
> java.library.path=C:\Oracle\Middleware\jdk160_29\bin;C:\WINDOWS\Sun\Java\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program
> Files (x86)\Microsoft Application Virtualization
> Client;C:\Java\jrockit-jdk1.6.0_37-R28.2.5-4.1.0;C:\Program
>
> Files\TortoiseSVN\bin;C:\Oracl;C:\Users\user123\AppData\Local\Enthought\Canopy32\User\Scripts;C:\Softwares\npp.6.3.2.bin\unicode\notepad++.exe;C:\Python\Python27;C:\Softwares\apache-maven-3.0.5\bin;C:\Softwares\depot_tools;C:\Softwares\apache-ant-1.9.1\bin;;.
> 2013-07-19 14:22:29 WARN  NativeCodeLoader:62 - Unable to load
> native-hadoop
> library for your platform... using builtin-java classes where applicable
> 2013-07-19 14:22:29 DEBUG JniBasedUnixGroupsMappingWithFallback:40 -
> Falling
> back to shell based
> 2013-07-19 14:22:29 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group
> mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
> 2013-07-19 14:22:29 DEBUG Groups:66 - Group mapping
> impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
> cacheTimeout=300000
> 2013-07-19 14:22:29 DEBUG UserGroupInformation:175 - hadoop login
> 2013-07-19 14:22:29 DEBUG UserGroupInformation:124 - hadoop login commit
> 2013-07-19 14:22:29 DEBUG UserGroupInformation:154 - using local
> user:NTUserPrincipal: user123
> 2013-07-19 14:22:29 DEBUG UserGroupInformation:697 - UGI loginUser:user123
> (auth:SIMPLE)
> 2013-07-19 14:22:29 DEBUG ZKUtil:120 - hconnection opening connection to
> ZooKeeper with ensemble (192.168.56.101:2181)
> 2013-07-19 14:22:29 INFO  RecoverableZooKeeper:104 - The identifier of this
> process is 50384@WIN75CB245190F
> 2013-07-19 14:22:29 DEBUG ZooKeeperWatcher:273 - hconnection Received
> ZooKeeper Event, type=None, state=SyncConnected, path=null
> 2013-07-19 14:22:29 DEBUG ZooKeeperWatcher:350 -
> hconnection-0x13ff7e7fc02000a connected
> 2013-07-19 14:22:29 DEBUG ZKUtil:1601 - hconnection-0x13ff7e7fc02000a
> Retrieved 36 byte(s) of data from znode /hbase/hbaseid;
> data=494de2c7-aa9c-4b24-9f2c-90ba6...
> 2013-07-19 14:22:29 DEBUG ZKUtil:423 - hconnection-0x13ff7e7fc02000a Set
> watcher on existing znode /hbase/master
> 2013-07-19 14:22:29 DEBUG ZKUtil:1601 - hconnection-0x13ff7e7fc02000a
> Retrieved 36 byte(s) of data from znode /hbase/master and set watcher;
> \x00\x00ubuntu-12.04.2,47663,...
> 2013-07-19 14:22:29 DEBUG ZKUtil:423 - hconnection-0x13ff7e7fc02000a Set
> watcher on existing znode /hbase/root-region-server
> 2013-07-19 14:22:29 DEBUG ZKUtil:1601 - hconnection-0x13ff7e7fc02000a
> Retrieved 34 byte(s) of data from znode /hbase/root-region-server and set
> watcher; ubuntu-12.04.2,50780,13742537...
> 2013-07-19 14:22:29 DEBUG HBaseRPC:102 - Using RpcEngine:
> org.apache.hadoop.hbase.ipc.WritableRpcEngine
> 2013-07-19 14:22:29 DEBUG HBaseClient:860 - The ping interval is60000ms.
> 2013-07-19 14:22:29 DEBUG ZKUtil:1601 - hconnection-0x13ff7e7fc02000a
> Retrieved 34 byte(s) of data from znode /hbase/root-region-server and set
> watcher; ubuntu-12.04.2,50780,13742537...
> 2013-07-19 14:22:29 DEBUG HConnectionManager$HConnectionImplementation:875
> -
> Looked up root region location,
>
> connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@1077fc9
> ;
> serverName=ubuntu-12.04.2,50780,1374253744820
> 2013-07-19 14:22:29 DEBUG HBaseClient:434 - Connecting to
> org.apache.hadoop.hbase.ipc.HBaseClient$ConnectionId@430581cd
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #0
> 2013-07-19 14:22:29 DEBUG HBaseClient:575 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123: starting, having
> connections 1
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #0
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: getProtocolVersion 36
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #1
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #1
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: getClosestRowBefore 5
> 2013-07-19 14:22:29 DEBUG HConnectionManager$HConnectionImplementation:1266
> - Cached location for .META.,,1.1028785192 is ubuntu-12.04.2:50780
> 2013-07-19 14:22:29 WARN  Configuration:824 - hadoop.native.lib is
> deprecated. Instead, use io.native.lib.available
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #2
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #2
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: getClosestRowBefore 3
> 2013-07-19 14:22:29 DEBUG MetaScanner:200 - Scanning .META. starting at
> row=HBASE_TEST_TABLE,,00000000000000 for max=10 rows using
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@1077fc9
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #3
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #3
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: openScanner 3
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #4
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #4
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: next 6
> 2013-07-19 14:22:29 DEBUG HConnectionManager$HConnectionImplementation:1266
> - Cached location for
> HBASE_TEST_TABLE,,1373942516744.9243ede59ac4a179cae677d7c4ca964c. is
> ubuntu-12.04.2:50780
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #5
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #5
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: close 2
> 2013-07-19 14:22:29 INFO  HBaseProcessImpl:145 -
> ****************************************************************
> 2013-07-19 14:22:29 INFO  HBaseProcessImpl:146 - Start row key -->
> 8f064a8b5da9b68adbaf0db302363dee, Stop row key -->
> 8f064a8b5da9b68adbaf0db302363def
> 2013-07-19 14:22:29 INFO  HBaseProcessImpl:148 -
> *****************************************************************
> 2013-07-19 14:22:29 INFO  HBaseProcessImpl:251 -
> ***********************************************
> 2013-07-19 14:22:29 INFO  HBaseProcessImpl:252 - Filter info:: FilterList
> OR
> (5/6): [SingleColumnValueFilter (ca, city, EQUAL, Chicago),
> SingleColumnValueFilter (ca, state, EQUAL, IL), SingleColumnValueFilter
> (ca,
> country, EQUAL, ), SingleColumnValueFilter (ca, mcc, EQUAL, 5814),
> SingleColumnValueFilter (ca, terminal_owner, EQUAL, )]
> 2013-07-19 14:22:29 INFO  HBaseProcessImpl:253 -
> ***********************************************
> 2013-07-19 14:22:29 DEBUG ClientScanner:90 - Creating scanner over
> HBASE_TEST_TABLE starting at key '8f064a8b5da9b68adbaf0db302363dee'
> 2013-07-19 14:22:29 DEBUG ClientScanner:198 - Advancing internal scanner to
> startKey at '8f064a8b5da9b68adbaf0db302363dee'
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #6
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #6
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: openScanner 3
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #7
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #7
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: next 5
> 2013-07-19 14:22:29 DEBUG HBaseClient:608 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 sending #8
> 2013-07-19 14:22:29 DEBUG HBaseClient:649 - IPC Client (6888942) connection
> to ubuntu-12.04.2/192.168.56.101:50780 from user123 got value #8
> 2013-07-19 14:22:29 DEBUG RPCEngine:92 - Call: close 1
> 2013-07-19 14:22:29 DEBUG ClientScanner:185 - Finished with scanning at
> {NAME =>
> 'HBASE_TEST_TABLE,,1373942516744.9243ede59ac4a179cae677d7c4ca964c.',
> STARTKEY => '', ENDKEY => '', ENCODED => 9243ede59ac4a179cae677d7c4ca964c,}
> 2013-07-19 14:22:29 INFO  TaskExecutorService:87 -  getActiveCount:0
> getCompletedTaskCount:1 getCorePoolSize:10 getLargestPoolSize:10
> getPoolSize:10 getTaskCount:1 getQueue().size():0
> 2013-07-19 14:22:29 INFO  CustomExecutorPool:143 - added executor back to
> pool
> 2013-07-19 14:22:29 INFO  DefaultRequestHandler:59 - Successfully processed
> payLoad:: d7ca6619-cee6-498f-91a8-bb18de7776c8 with size::1
> 2013-07-19 14:22:29 INFO  MerchantInfoLookup:126 - Request Identifier::
> d7ca6619-cee6-498f-91a8-bb18de7776c8
> 2013-07-19 14:22:29 INFO  MerchantInfoLookup:127 - Response list size:: 1
> 2013-07-19 14:22:29 INFO  MerchantInfoLookup:128 - ******************
> OUTPUT
> RECORD ****************************
> </pre>
>
>
>
> --
> View this message in context:
> http://apache-hbase.679495.n3.nabble.com/Accessing-HBase-using-java-native-api-from-weblogic-tp4048151.html
> Sent from the HBase User mailing list archive at Nabble.com.
>

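For reference, the client configuration both runs report in their logs (quorum 192.168.56.101, client port 2181, master 192.168.56.101:60000) corresponds to the standard HBase client properties below — a sketch of an `hbase-site.xml` on the client classpath, assuming the values are not being set programmatically instead:

```xml
<configuration>
  <!-- ZooKeeper ensemble the client contacts first; matches the quorum logged above -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>192.168.56.101</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```

In the WebLogic case it is worth confirming that this file (or its programmatic equivalent) is actually visible to the deployed application's classloader, since the logs show both runs resolve the same ensemble address.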