Subject: Re: hive hbase integration
From: "kulkarni.swarnim@gmail.com"
Date: Thu, 26 Dec 2013 10:40:29 -0600
To: user@hive.apache.org

Seems like you are running Hive on YARN instead of MR1. I have had some issues in the past doing so. The post here [1] has some solutions on how to configure Hive to work with YARN. Hope that helps. A quick classpath check is also sketched below your quoted trace.

[1] https://groups.google.com/a/cloudera.org/forum/#!topic/cdh-user/gHVq9C5H6RE

On Thu, Dec 26, 2013 at 10:35 AM, Vikas Parashar <para.vikas@gmail.com> wrote:
> Hi,
>
> I am integrating Hive (0.12) with HBase (0.96). Everything is working fine
> there, but I am stuck on the difference between two queries.
>
> When I create a table or run select * from table, it works fine,
> but select count(*) from table gives me the error below.
>
> 2013-12-26 13:25:01,864 ERROR ql.Driver (SessionState.java:printError(419)) - FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> 2013-12-26 13:25:01,869 WARN  mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
> 2013-12-26 14:25:44,119 WARN  mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
> 2013-12-26 14:26:14,677 WARN  mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 2013-12-26 14:26:33,613 WARN  mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 2013-12-26 14:27:30,355 WARN  mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 2013-12-26 14:27:32,479 WARN  mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 2013-12-26 14:27:32,528 ERROR exec.Task (SessionState.java:printError(419)) - Ended Job = job_1388037394132_0013 with errors
> 2013-12-26 14:27:32,530 ERROR exec.Task (SessionState.java:printError(419)) - Error during job, obtaining debugging information...
> 2013-12-26 14:27:32,538 ERROR exec.Task (SessionState.java:printError(419)) - Examining task ID: task_1388037394132_0013_m_000000 (and more) from job job_1388037394132_0013
> 2013-12-26 14:27:32,539 WARN  shims.HadoopShimsSecure (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog: TaskLogServlet is not supported in MR2 mode.
> 2013-12-26 14:27:32,593 WARN  shims.HadoopShimsSecure (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog: TaskLogServlet is not supported in MR2 mode.
> 2013-12-26 14:27:32,596 WARN  shims.HadoopShimsSecure (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog: TaskLogServlet is not supported in MR2 mode.
> 2013-12-26 14:27:32,599 WARN  shims.HadoopShimsSecure (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog: TaskLogServlet is not supported in MR2 mode.
> 2013-12-26 14:27:32,615 ERROR exec.Task (SessionState.java:printError(419)) -
> Task with the most failures(4):
> -----
> Task ID:
>   task_1388037394132_0013_m_000000
>
> URL:
>   http://ambari1.hadoop.com:8088/taskdetails.jsp?jobid=job_1388037394132_0013&tipid=task_1388037394132_0013_m_000000
> -----
> Diagnostic Messages for this Task:
> Error: java.io.IOException: java.io.IOException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:244)
>     at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:538)
>     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
>     at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
>     at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
>     at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
>     at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
>     at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:91)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:241)
>     ... 9 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>     at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
>     ... 15 more
> Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
>     at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
>     at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
>     at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
>     at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
>     at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
>     ... 20 more
> Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>     ... 26 more
>
> 2013-12-26 14:27:32,870 ERROR ql.Driver (SessionState.java:printError(419)) - FAILED: Execution Error, return code 2 from org.apache.
>
> I think this error is related to the MapReduce job. Whenever my query uses
> MapReduce, I get the error.
>
> Any idea!!

--
Swarnim
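
Separate from the YARN configuration, one thing worth checking in the quoted trace: it bottoms out in java.lang.ClassNotFoundException: org.cloudera.htrace.Trace, which suggests the htrace jar that ships with HBase 0.96 is not reaching the map tasks. That would also fit the symptom that only count(*) breaks: a plain select * can be answered by the Hive client itself as a fetch, and the client already has the HBase jars, while count(*) launches a MapReduce job whose tasks get their own classpath. A minimal sketch of the check, assuming a typical HBase 0.96 layout (the jar paths, versions, and table name below are illustrative only, adjust them to your install):

  -- Run from the Hive CLI before the failing query.
  -- Paths and versions are examples; htrace-core is the jar that contains org.cloudera.htrace.Trace.
  ADD JAR /usr/lib/hbase/lib/htrace-core-2.01.jar;
  ADD JAR /usr/lib/hbase/lib/hbase-client-0.96.0-hadoop2.jar;
  ADD JAR /usr/lib/hbase/lib/hbase-common-0.96.0-hadoop2.jar;
  ADD JAR /usr/lib/hbase/lib/hbase-protocol-0.96.0-hadoop2.jar;
  ADD JAR /usr/lib/hbase/lib/hbase-server-0.96.0-hadoop2.jar;
  -- "my_hbase_table" stands in for the HBase-backed table from the question.
  SELECT COUNT(*) FROM my_hbase_table;

If the count succeeds once the jars are added, the same list can be made permanent through hive.aux.jars.path (or the HIVE_AUX_JARS_PATH environment variable) so every job ships them automatically.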