From: "@Sanjiv Singh" <sanjiv.is.on@gmail.com>
Reply-To: sanjiv.is.on@gmail.com
To: user@hive.apache.org
Date: Tue, 23 Jun 2015 09:24:09 +0530
Subject: Re: Query Timeout

Hi Ibrar,

What is the value given for:

  --hiveconf hbase.master=#####
OR
  --hiveconf hbase.zookeeper.quorum=########

It seems from the error that the HBase server configuration is not correct.

Regards
Sanjiv Singh
Mob : +091 9990-447-339
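For reference, a minimal sketch of how these properties can be passed when starting the Hive CLI; the host names and ports below are placeholders, not values from your setup, so substitute your cluster's actual ZooKeeper quorum and HBase master:

  # placeholder hosts/ports -- replace with the values from your hbase-site.xml
  hive --hiveconf hbase.zookeeper.quorum=zk1.example.com,zk2.example.com,zk3.example.com \
       --hiveconf hbase.zookeeper.property.clientPort=2181 \
       --hiveconf hbase.master=hbase-master.example.com:60000

The same properties can also be set once in hive-site.xml (or by putting the cluster's hbase-site.xml on Hive's classpath) instead of being passed on every invocation.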
On Wed, Jun 17, 2015 at 4:31 PM, Ibrar Ahmed <ibrar.ahmad@gmail.com> wrote:

> I am able to fix that issue, but got another error
>
> [127.0.0.1:10000] hive> CREATE TABLE IF NOT EXISTS pagecounts_hbase
>   (rowkey STRING, pageviews STRING, bytes STRING)
>   STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>   WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,f:c1,f:c2')
>   TBLPROPERTIES ('hbase.table.name' = 'pagecounts');
>
> [Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution
> Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:java.lang.IllegalArgumentException: Not a host:port pair: PBUF
> "
> ibrar-virtual-machine �� �߯��) ��
>     at org.apache.hadoop.hbase.util.Addressing.parseHostname(Addressing.java:60)
>     at org.apache.hadoop.hbase.ServerName.<init>(ServerName.java:96)
>     at org.apache.hadoop.hbase.ServerName.parseVersionedServerName(ServerName.java:278)
>     at org.apache.hadoop.hbase.MasterAddressTracker.bytesToServerName(MasterAddressTracker.java:77)
>     at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:61)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:631)
>     at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:106)
>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:84)
>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>     at com.sun.proxy.$Proxy7.createTable(Unknown Source)
>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
>     at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
>
> On Wed, Jun 17, 2015 at 3:51 PM, Ibrar Ahmed <ibrar.ahmad@gmail.com> wrote:
>
>> Hi,
>>
>> What's wrong with my settings?
>> [127.0.0.1:10000] hive> CREATE TABLE IF NOT EXISTS pagecounts_hbase
>>   (rowkey STRING, pageviews STRING, bytes STRING)
>>   STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>   WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,f:c1,f:c2')
>>   TBLPROPERTIES ('hbase.table.name' = 'pagecounts');
>>
>> [Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution
>> Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
>> MetaException(message:MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException:
>> Retried 10 times
>>     at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:127)
>>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:84)
>>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>     at com.sun.proxy.$Proxy7.createTable(Unknown Source)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
>>     at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
>>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
>>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
>>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:745)
>> )
>>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:88)
>>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>     at com.sun.proxy.$Proxy7.createTable(Unknown Source)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
>>     at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
>>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
>>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
>>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:745)
>>
>> --ibrar