Date: Mon, 26 Nov 2012 14:45:03 -0800
Subject: Re: Runs in Eclipse but not as a Jar
From: Suraj Varma <svarma.ng@gmail.com>
To: user@hbase.apache.org

The difference is your classpath.

For problem 1, you need to put the jars under /home/ngc/hbase-0.94.2/lib on your classpath. You only need a subset, but to get past the error, start by adding all of them. I don't think specifying a wildcard "*" works, as in:

ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable -classpath "/home/ngc/hbase-0.94.2/*"

You can run bin/hbase classpath to print the full classpath and include that in your command-line script (see the sketch at the bottom of this mail).

In addition to the jars, you also need to add your hbase-site.xml (client side) to the classpath. That would be your problem 2.

Hope that helps.
--Suraj

On Mon, Nov 26, 2012 at 1:03 PM, Ratner, Alan S (IS) wrote:
> I am running HBase 0.94.2 on 6 servers with Zookeeper 3.4.5 running on 3. HBase works from its shell and from within Eclipse, but not as a jar file. When I run within Eclipse I can see it worked properly by using the HBase shell commands (such as scan).
>
> I seem to have 2 separate problems.
>
> Problem 1: when I create a jar file from Eclipse it won't run at all:
>
> ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable -classpath "/home/ngc/hbase-0.94.2/*"
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>         at HBase.CreateBiTable.run(CreateBiTable.java:26)
> [line 26 is: Configuration conf = HBaseConfiguration.create();]
>
> Problem 2: when I create a "runnable" jar file from Eclipse it communicates with Zookeeper but then dies with:
>
> Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: \ufffd
> 5800@hadoop1hadoop1.aj.c2fse.northgrum.com,60000,1353949574468
>
> I'd prefer to use a regular jar (5 KB) rather than a runnable jar (100 MB). But I assume that if I fix Problem 1 it will proceed until it crashes with Problem 2.
>
> Thanks in advance for any suggestions --- Alan.
>
> -----------------------------
> CLASSPATH
> ngc@hadoop1:~/hadoop-1.0.4$ env | grep CLASSPATH
> CLASSPATH=/home/ngc/hadoop-1.0.4:/home/ngc/hbase-0.94.2/bin:/home/ngc/zookeeper-3.4.5/bin:/home/ngc/accumulo-1.3.5-incubating
>
> -----------------------------
> HBASE PROGRAM
> package HBase;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.conf.Configured;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.HColumnDescriptor;
> import org.apache.hadoop.hbase.HTableDescriptor;
> import org.apache.hadoop.hbase.client.HBaseAdmin;
> import org.apache.hadoop.util.Tool;
> import org.apache.hadoop.util.ToolRunner;
>
> public class CreateBiTable extends Configured implements Tool {
>     public static String TableName = new String("BiIPTable");
>     public static String cf = "cf"; // column family
>     public static String c1 = "c1"; // column 1
>
>     public static void main(String[] args) throws Exception {
>         long startTime = System.currentTimeMillis();
>         int res = ToolRunner.run(new Configuration(), new CreateBiTable(), args);
>         double duration = (System.currentTimeMillis() - startTime)/1000.0;
>         System.out.println(">>>> Job Finished in " + duration + " seconds");
>         System.exit(res);
>     }
>
>     public int run(String[] arg0) throws Exception {
>         Configuration conf = HBaseConfiguration.create(); // System.out.println("Configuration created");
>         System.out.println("\t"+conf.toString());
>         HBaseAdmin admin = new HBaseAdmin(conf); // System.out.println("\t"+admin.toString());
>         if (admin.tableExists(TableName)) {
>             // Disable and delete the table if it exists
>             admin.disableTable(TableName);
>             admin.deleteTable(TableName);
>             System.out.println(TableName+" exists so deleted");
>         }
>         // Create table
>         HTableDescriptor htd = new HTableDescriptor(TableName);
>         HColumnDescriptor hcd = new HColumnDescriptor(cf);
>         htd.addFamily(hcd);
>         admin.createTable(htd);
>         System.out.println("Table created: "+htd);
>         // Does the table exist now?
>         if (admin.tableExists(TableName))
>             System.out.println(TableName+" creation succeeded");
>         else
>             System.out.println(TableName+" creation failed");
>         return 0;
>     }
> }
>
> -----------------------------
> OUTPUT FROM RUNNING WITHIN ECLIPSE
> Configuration: core-default.xml, core-site.xml, hbase-default.xml, hbase-site.xml
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/ngc/hadoop-1.0.4/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:host.name=hadoop1.aj.c2fse.northgrum.com
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_25
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.home=/home/ngc/jdk1.6.0_25/jre
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/ngc/AlanSpace/HadoopPrograms/bin:/home/ngc/hadoop-1.0.4/hadoop-core-1.0.4.jar:/home/ngc/zookeeper-3.4.5/zookeeper-3.4.5.jar:/home/ngc/JavaLibraries/Jama/Jama-1.0.2.jar:/home/ngc/AlansOpenCVStuff/core.jar:/home/ngc/OpenCV-2.2.0/javacv/javacpp.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-linux-x86.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-linux-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-macosx-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-windows-x86.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv-windows-x86_64.jar:/home/ngc/OpenCV-2.2.0/javacv/javacv.jar:/home/ngc/OpenCV-2.2.0/lib:/home/ngc/javafaces/lib/colt.jar:/home/ngc/AlansOpenCVStuff/commons-math3-3.0/commons-math3-3.0.jar:/home/ngc/AlansOpenCVStuff/commons-math3-3.0/commons-math3-3.0-javadoc.jar:/home/ngc/Downloads/jtransforms-2.4.jar:/home/ngc/mahout-distribution-0.7/mahout-core-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-core-0.7-job.jar:/home/ngc/mahout-distribution-0.7/mahout-integration-0.7.jar:/home/ngc/hbase-0.94.2/hbase-0.94.2.jar:/home/ngc/mahout-distribution-0.7/mahout-math-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7.jar:/home/ngc/mahout-distribution-0.7/mahout-examples-0.7-job.jar:/home/ngc/mahout-distribution-0.7/lib/commons-cli-2.0-mahout.jar:/home/ngc/mahout-distribution-0.7/lib/uncommons-maths-1.2.2.jar:/home/ngc/pig-0.10.0/pig-0.10.0.jar:/home/ngc/Cascading/cascading-core-2.1.0-wip-76.jar:/home/ngc/Cascading/cascading-hadoop-2.1.0-wip-76.jar:/home/ngc/Cascading/cascading-local-2.1.0-wip-76.jar:/home/ngc/Cascading/cascading-xml-2.1.0-wip-76.jar:/home/ngc/hadoop-1.0.4/hadoop-ant-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-client-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-examples-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-minicluster-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-test-1.0.4.jar:/home/ngc/hadoop-1.0.4/hadoop-tools-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/asm-3.2.jar:/home/ngc/hadoop-1.0.4/lib/aspectjrt-1.6.5.jar:/home/ngc/hadoop-1.0.4/lib/aspectjtools-1.6.5.jar:/home/ngc/hadoop-1.0.4/lib/commons-beanutils-1.7.0.jar:/home/ngc/hadoop-1.0.4/lib/commons-beanutils-core-1.8.0.jar:/home/ngc/hadoop-1.0.4/lib/commons-cli-1.2.jar:/home/ngc/hadoop-1.0.4/lib/commons-codec-1.4.jar:/home/ngc/hadoop-1.0.4/lib/commons-collections-3.2.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-configuration-1.6.jar:/home/ngc/hadoop-1.0.4/lib/commons-daemon-1.0.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-digester-1.8.jar:/home/ngc/hadoop-1.0.4/lib/commons-el-1.0.jar:/home/ngc/hadoop-1.0.4/lib/commons-io-2.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-logging-1.1.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-logging-api-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/commons-math-2.1.jar:/home/ngc/hadoop-1.0.4/lib/commons-net-1.4.1.jar:/home/ngc/hadoop-1.0.4/lib/core-3.1.1.jar:/home/ngc/hadoop-1.0.4/lib/hadoop-capacity-scheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/hadoop-fairscheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/hadoop-thriftfs-1.0.4.jar:/home/ngc/hadoop-1.0.4/lib/hsqldb-1.8.0.10.jar:/home/ngc/hadoop-1.0.4/lib/jackson-core-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/lib/jackson-mapper-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/lib/jdeb-0.8.jar:/home/ngc/hadoop-1.0.4/lib/jersey-core-1.8.jar:/home/ngc/hadoop-1.0.4/lib/jersey-json-1.8.jar:/home/ngc/hadoop-1.0.4/lib/jersey-server-1.8.jar:/home/ngc/hadoop-1.0.4/lib/jets3t-0.6.1.jar:/home/ngc/hadoop-1.0.4/lib/jetty-6.1.26.jar:/home/ngc/hadoop-1.0.4/lib/jetty-util-6.1.26.jar:/home/ngc/hadoop-1.0.4/lib/jsch-0.1.42.jar:/home/ngc/hadoop-1.0.4/lib/junit-4.5.jar:/home/ngc/hadoop-1.0.4/lib/kfs-0.2.2.jar:/home/ngc/hadoop-1.0.4/lib/mockito-all-1.8.5.jar:/home/ngc/hadoop-1.0.4/lib/oro-2.0.8.jar:/home/ngc/hadoop-1.0.4/lib/slf4j-api-1.4.3.jar:/home/ngc/hadoop-1.0.4/lib/slf4j-log4j12-1.4.3.jar:/home/ngc/hadoop-1.0.4/lib/xmlenc-0.52.jar:/home/ngc/Data/SchemaB.jar:/home/ngc/hive/lib/hive-exec-0.7.0.jar:/home/ngc/hbase-0.94.2/hbase-0.94.2-tests.jar:/home/ngc/hbase-0.94.2/lib/activation-1.1.jar:/home/ngc/hbase-0.94.2/lib/avro-1.5.3.jar:/home/ngc/hbase-0.94.2/lib/avro-ipc-1.5.3.jar:/home/ngc/hbase-0.94.2/lib/commons-digester-1.8.jar:/home/ngc/hbase-0.94.2/lib/commons-httpclient-3.1.jar:/home/ngc/hbase-0.94.2/lib/commons-lang-2.5.jar:/home/ngc/hbase-0.94.2/lib/guava-11.0.2.jar:/home/ngc/hbase-0.94.2/lib/high-scale-lib-1.1.1.jar:/home/ngc/hbase-0.94.2/lib/httpclient-4.1.2.jar:/home/ngc/hbase-0.94.2/lib/httpcore-4.1.3.jar:/home/ngc/hbase-0.94.2/lib/jackson-jaxrs-1.8.8.jar:/home/ngc/hbase-0.94.2/lib/jackson-xc-1.8.8.jar:/home/ngc/hbase-0.94.2/lib/jamon-runtime-2.3.1.jar:/home/ngc/hbase-0.94.2/lib/jasper-compiler-5.5.23.jar:/home/ngc/hbase-0.94.2/lib/jasper-runtime-5.5.23.jar:/home/ngc/hbase-0.94.2/lib/jaxb-api-2.1.jar:/home/ngc/hbase-0.94.2/lib/jaxb-impl-2.2.3-1.jar:/home/ngc/hbase-0.94.2/lib/jettison-1.1.jar:/home/ngc/hbase-0.94.2/lib/jruby-complete-1.6.5.jar:/home/ngc/hbase-0.94.2/lib/jsp-2.1-6.1.14.jar:/home/ngc/hbase-0.94.2/lib/jsp-api-2.1-6.1.14.jar:/home/ngc/hbase-0.94.2/lib/jsr305-1.3.9.jar:/home/ngc/hbase-0.94.2/lib/junit-4.10-HBASE-1.jar:/home/ngc/hbase-0.94.2/lib/libthrift-0.8.0.jar:/home/ngc/hbase-0.94.2/lib/log4j-1.2.16.jar:/home/ngc/hbase-0.94.2/lib/metrics-core-2.1.2.jar:/home/ngc/hbase-0.94.2/lib/netty-3.2.4.Final.jar:/home/ngc/hbase-0.94.2/lib/protobuf-java-2.4.0a.jar:/home/ngc/hbase-0.94.2/lib/servlet-api-2.5-6.1.14.jar:/home/ngc/hbase-0.94.2/lib/snappy-java-1.0.3.2.jar:/home/ngc/hbase-0.94.2/lib/stax-api-1.0.1.jar:/home/ngc/hbase-0.94.2/lib/velocity-1.7.jar
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/ngc/AlansOpenCVStuff
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:java.compiler=
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:os.version=3.2.0-24-generic
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.name=ngc
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ngc
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ngc/AlanSpace/HadoopPrograms
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
> 12/11/26 13:48:54 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab1196001f, negotiated timeout = 40000
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@30ff8c74
> 12/11/26 13:48:54 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x13b3dab11960020, negotiated timeout = 40000
> 12/11/26 13:48:54 INFO zookeeper.ZooKeeper: Session: 0x13b3dab11960020 closed
> 12/11/26 13:48:54 INFO zookeeper.ClientCnxn: EventThread shut down
> 12/11/26 13:48:54 INFO client.HBaseAdmin: Started disable of BiIPTable
> 12/11/26 13:48:56 INFO client.HBaseAdmin: Disabled BiIPTable
> 12/11/26 13:48:57 INFO client.HBaseAdmin: Deleted BiIPTable
> BiIPTable exists so deleted
> Table created: {NAME => 'BiIPTable', FAMILIES => [{NAME => 'cf', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION => 'NONE', VERSIONS => '3', TTL => '2147483647', MIN_VERSIONS => '0', KEEP_DELETED_CELLS => 'false', BLOCKSIZE => '65536', ENCODE_ON_DISK => 'true', IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}
> 12/11/26 13:48:59 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=catalogtracker-on-org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@30ff8c74
> 12/11/26 13:48:59 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 23098@hadoop1
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (Unable to locate a login configuration)
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab11960021, negotiated timeout = 40000
> 12/11/26 13:48:59 INFO zookeeper.ZooKeeper: Session: 0x13b3dab11960021 closed
> 12/11/26 13:48:59 INFO zookeeper.ClientCnxn: EventThread shut down
> BiIPTable creation succeeded
> >>>> Job Finished in 5.177 seconds
>
> -----------------------------
> OUTPUT FROM REGULAR JAR FILE
> ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>         at HBase.CreateBiTable.run(CreateBiTable.java:26)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at HBase.CreateBiTable.main(CreateBiTable.java:19)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         ... 8 more
>
> -----------------------------
> OUTPUT FROM RUNNABLE JAR FILE
> ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable
> Configuration: core-default.xml, core-site.xml, hbase-default.xml, hbase-site.xml
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.3.3-1073969, built on 02/23/2011 22:27 GMT
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:host.name=hadoop1.aj.c2fse.northgrum.com
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_25
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.home=/home/ngc/jdk1.6.0_25/jre
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/home/ngc/hadoop-1.0.4/libexec/../conf:/home/ngc/jdk1.6.0_25/lib/tools.jar:/home/ngc/hadoop-1.0.4/libexec/..:/home/ngc/hadoop-1.0.4/libexec/../hadoop-core-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/asm-3.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/aspectjrt-1.6.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/aspectjtools-1.6.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-beanutils-1.7.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-cli-1.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-codec-1.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-collections-3.2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-configuration-1.6.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-daemon-1.0.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-digester-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-el-1.0.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-httpclient-3.0.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-io-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-lang-2.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-logging-1.1.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-logging-api-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-math-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/commons-net-1.4.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/core-3.1.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-capacity-scheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-fairscheduler-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hadoop-thriftfs-1.0.4.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/hsqldb-1.8.0.10.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jasper-compiler-5.5.12.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jasper-runtime-5.5.12.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jdeb-0.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-core-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-json-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jersey-server-1.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jets3t-0.6.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jetty-6.1.26.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jetty-util-6.1.26.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsch-0.1.42.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/junit-4.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/kfs-0.2.2.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/log4j-1.2.15.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/mockito-all-1.8.5.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/oro-2.0.8.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/servlet-api-2.5-20081211.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/slf4j-api-1.4.3.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/xmlenc-0.52.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/ngc/hadoop-1.0.4/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/ngc/hadoop-1.0.4/libexec/../lib/native/Linux-amd64-64
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:java.compiler=
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:os.version=3.2.0-24-generic
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.name=ngc
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/ngc
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/ngc/hadoop-1.0.4
> 12/11/26 14:15:17 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=180000 watcher=hconnection
> 12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Opening socket connection to server localhost/127.0.0.1:2181
> 12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Socket connection established to localhost/127.0.0.1:2181, initiating session
> 12/11/26 14:15:17 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13b3dab11960023, negotiated timeout = 40000
> Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: \ufffd
> 5800@hadoop1hadoop1.aj.c2fse.northgrum.com,60000,1353949574468
>         at org.apache.hadoop.hbase.HServerAddress.<init>(HServerAddress.java:60)
>         at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:63)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:352)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:90)
>         at HBase.CreateBiTable.run(CreateBiTable.java:29)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at HBase.CreateBiTable.main(CreateBiTable.java:19)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>
> Alan
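
P.S. Here is a rough, untested sketch of the classpath setup I mean, assuming the install locations from your mail (adjust the paths for your setup). bin/hadoop should pick up HADOOP_CLASSPATH from the environment, and the output of bin/hbase classpath should already begin with HBase's conf directory so that the client-side hbase-site.xml is picked up as well; if it doesn't, append that conf directory to HADOOP_CLASSPATH yourself:

# put the HBase jars and client config on hadoop's classpath, then run the job as before
ngc@hadoop1:~/hadoop-1.0.4$ export HADOOP_CLASSPATH=$(/home/ngc/hbase-0.94.2/bin/hbase classpath)
ngc@hadoop1:~/hadoop-1.0.4$ bin/hadoop jar ../eclipse/CreateBiTable.jar HBase/CreateBiTable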