hbase-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: java.lang.NoClassDefFoundError: com/google/protobuf/ZeroCopyLiteralByteString
Date Fri, 14 Nov 2014 15:49:44 GMT
bq.     <classpathentry kind="lib" path="lib/hbase-protocol.jar"/>

The above jar file doesn't have a version in its name.
Can you confirm that it is the right jar?
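
For what it's worth, in the 0.96 line that class ships inside the hbase-protocol jar itself (HBase places it in the com.google.protobuf package), not inside protobuf-java-2.5.0.jar, so the unversioned jar is the likely culprit if it came from a different HBase build. A minimal sketch to see which jar, if any, on the runtime classpath actually provides the class (the class name ZeroCopyCheck is just for illustration):

public class ZeroCopyCheck {
    public static void main(String[] args) {
        try {
            // The class HBase 0.96 RPC needs; it is bundled in hbase-protocol, not protobuf-java.
            Class<?> clazz = Class.forName("com.google.protobuf.ZeroCopyLiteralByteString");
            // Print the jar the class was actually loaded from.
            System.out.println(clazz.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("ZeroCopyLiteralByteString is not on the classpath");
        }
    }
}

If that prints nothing useful or throws ClassNotFoundException, replacing lib/hbase-protocol.jar with the hbase-protocol jar that matches the server (presumably hbase-protocol-0.96.1.1-cdh5.0.2.jar on this install) should make the NoClassDefFoundError go away.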

On Fri, Nov 14, 2014 at 6:13 AM, antarktika <net.tyumen@mail.ru> wrote:

> I am trying a simple Java program that connects to HBase and I get the error:
> java.lang.NoClassDefFoundError:
> com/google/protobuf/ZeroCopyLiteralByteString
>
> The part of the code:
>
> Configuration config = HBaseConfiguration.create();
> config.set("hbase.master", "myserver_ip_address:60000");
> config.set("hbase.zookeeper.quorum", "myserver_ip_address");
> config.set("hbase.zookeeper.property.clientPort", "2181");
> HBaseAdmin admin = new HBaseAdmin(config);
> HTableFactory factory = new HTableFactory();
> ...
>
> The error appears at the line HBaseAdmin admin = new HBaseAdmin(config);
>
> All libraries on the classpath come from the Hadoop and HBase lib folders on
> the server, so the jar versions should all be compatible.
>
> The result is:
>
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:zookeeper.version=3.4.5-cdh5.0.2--1, built on 06/09/2014 16:09
> GMT
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:host.name=myserver_name
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:java.version=1.7.0_45
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:java.vendor=Oracle Corporation
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:java.home=/usr/java/jdk1.7.0_45-cloudera/jre
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/etc/hadoop/conf:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/libexec/../../hadoop/lib/activation-1.1.jar:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/libexec/../../hadoop/lib/jsp-api-2.1.jar:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/libexec/../../hadoop/lib/jackson-core-asl-1.8.8.jar:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/libexec/../../hadoop/lib/snappy-java-1.0.4.1.jar:
> (... other jars)
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/lib/native
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:java.io.tmpdir=/tmp
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:java.compiler=<NA>
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:os.arch=amd64
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:os.version=2.6.32-431.20.3.el6.x86_64
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client environment:user.name=tr
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:user.home=/home/tr
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Client
> environment:user.dir=/home/tr/test/codes
> 14/11/14 15:49:48 INFO zookeeper.ZooKeeper: Initiating client connection,
> connectString=myserver_name:2181 sessionTimeout=1000
> watcher=hconnection-0x69e017e4, quorum=myserver_name:2181, baseZNode=/hbase
> 14/11/14 15:49:48 INFO zookeeper.RecoverableZooKeeper: Process
> identifier=hconnection-0x69e017e4 connecting to ZooKeeper
> ensemble=myserver_name:2181
> 14/11/14 15:49:48 INFO zookeeper.ClientCnxn: Opening socket connection to
> server myserver_name/myserver_ip_address:2181. Will not attempt to
> authenticate using SASL (unknown error)
> 14/11/14 15:49:48 INFO zookeeper.ClientCnxn: Socket connection established
> to myserver_name/myserver_ip_address:2181, initiating session
> 14/11/14 15:49:48 INFO zookeeper.ClientCnxn: Session establishment complete
> on server myserver_name/myserver_ip_address:2181, sessionid =
> 0x1492e0f10597dd5, negotiated timeout = 4000
> HBaseAdmin variable is created
> 14/11/14 15:49:49 INFO zookeeper.ZooKeeper: Initiating client connection,
> connectString=myserver_name:2181 sessionTimeout=1000
> watcher=catalogtracker-on-hconnection-0x69e017e4,
> quorum=myserver_name:2181,
> baseZNode=/hbase
> 14/11/14 15:49:49 INFO zookeeper.RecoverableZooKeeper: Process
> identifier=catalogtracker-on-hconnection-0x69e017e4 connecting to ZooKeeper
> ensemble=myserver_name:2181
> 14/11/14 15:49:49 INFO zookeeper.ClientCnxn: Opening socket connection to
> server myserver_name/myserver_ip_address:2181. Will not attempt to
> authenticate using SASL (unknown error)
> 14/11/14 15:49:49 INFO zookeeper.ClientCnxn: Socket connection established
> to myserver_name/myserver_ip_address:2181, initiating session
> 14/11/14 15:49:49 INFO zookeeper.ClientCnxn: Session establishment complete
> on server myserver_name/myserver_ip_address:2181, sessionid =
> 0x1492e0f10597dd8, negotiated timeout = 4000
> 14/11/14 15:49:49 INFO Configuration.deprecation: hadoop.native.lib is
> deprecated. Instead, use io.native.lib.available
> 14/11/14 15:49:50 INFO zookeeper.ZooKeeper: Session: 0x1492e0f10597dd8
> closed
> 14/11/14 15:49:50 INFO zookeeper.ClientCnxn: EventThread shut down
> Exception in thread "main"
> org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
> attempts=2, exceptions:
> Fri Nov 14 15:49:49 EET 2014,
> org.apache.hadoop.hbase.client.RpcRetryingCaller@12c5fb80,
> java.lang.NoClassDefFoundError:
> com/google/protobuf/ZeroCopyLiteralByteString
> Fri Nov 14 15:49:50 EET 2014,
> org.apache.hadoop.hbase.client.RpcRetryingCaller@12c5fb80,
> java.lang.NoClassDefFoundError:
> com/google/protobuf/ZeroCopyLiteralByteString
>
>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:134)
>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:96)
>         at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:271)
>         at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:176)
>         at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:171)
>         at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:110)
>         at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:720)
>         at org.apache.hadoop.hbase.catalog.MetaReader.fullScan(MetaReader.java:538)
>         at org.apache.hadoop.hbase.catalog.MetaReader.tableExists(MetaReader.java:309)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:271)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:280)
>         at Sample.run(Sample.java:187)
>         at Main.main(Main.java:53)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> Caused by: java.lang.NoClassDefFoundError: com/google/protobuf/ZeroCopyLiteralByteString
>         at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:897)
>         at org.apache.hadoop.hbase.protobuf.RequestConverter.buildScanRequest(RequestConverter.java:420)
>         at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:297)
>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:157)
>         at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:57)
>         at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:120)
>         ... 17 more
>
> The .classpath included in the runnable jar file:
> <classpath>
>         <classpathentry kind="src" path="src"/>
>         <classpathentry kind="src" path="conf"/>
>         <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
>         <classpathentry kind="lib" path="lib/commons-lang-2.6.jar"/>
>         <classpathentry kind="lib" path="lib/commons-logging-1.1.1.jar"/>
>         <classpathentry kind="lib" path="lib/hadoop-common-2.3.0-cdh5.0.2.jar"/>
>         <classpathentry kind="lib" path="lib/hbase-client-0.96.1.1-hadoop2.jar"/>
>         <classpathentry kind="lib" path="lib/hbase-common-0.96.1.1-cdh5.0.2.jar"/>
>         <classpathentry kind="lib" path="lib/hbase-protocol.jar"/>
>         <classpathentry kind="lib" path="lib/log4j-1.2.17.jar"/>
>         <classpathentry kind="lib" path="lib/protobuf-java-2.5.0.jar"/>
>         <classpathentry kind="lib" path="lib/zookeeper-3.4.5-cdh5.0.2.jar"/>
>         <classpathentry kind="lib" path="lib/htrace-core-2.01.jar"/>
>         <classpathentry kind="output" path="bin"/>
> </classpath>
>
> So I checked protobuf-java-2.5.0.jar, and there is no ZeroCopyLiteralByteString
> class in it. Why is it required?
>
> --
> View this message in context:
> http://apache-hbase.679495.n3.nabble.com/java-lang-NoClassDefFoundError-com-google-protobuf-ZeroCopyLiteralByteString-tp4065954.html
> Sent from the HBase User mailing list archive at Nabble.com.
>
