Date: Fri, 2 Sep 2016 12:36:21 +0000 (UTC)
From: "prasannaP (JIRA)"
To: issues@spark.apache.org
Subject: [jira] [Comment Edited] (SPARK-17373) spark+hive+hbase+hbaseIntegration not working

[ https://issues.apache.org/jira/browse/SPARK-17373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15458406#comment-15458406 ]

prasannaP edited comment on SPARK-17373 at 9/2/16 12:35 PM:
------------------------------------------------------------

Thanks for your reply. How can I add the HBase classes,
and to which classpath should they be added? Could you please suggest how I can query HBaseStorageHandler-backed Hive tables through Spark SQL?

was (Author: prasannapadarthi@gmail.com): Thanks for your reply. How can I add the HBase classes, and to which classpath should they be added? Could you please suggest how I can query HBaseStorageHandler-backed Hive tables through Spark SQL?

> spark+hive+hbase+hbaseIntegration not working
> ---------------------------------------------
>
>         Key: SPARK-17373
>         URL: https://issues.apache.org/jira/browse/SPARK-17373
>     Project: Spark
>  Issue Type: Bug
>  Components: Spark Shell
>    Reporter: prasannaP
>      Labels: soon
>
> SparkSQL + Hive + HBase (HBaseIntegration) doesn't work.
> Hi,
> I am getting an error when I try to query, from Spark, a Hive table that was created through HBaseIntegration.
> Steps I followed:
> *Hive table creation code*:
> CREATE TABLE test.sample(id string, name string)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,details:name")
> TBLPROPERTIES ("hbase.table.name" = "sample");
> *DESCRIBE test.sample;*
> col_name  data_type  comment
> id        string     from deserializer
> name      string     from deserializer
> *Starting spark-shell*:
> spark-shell --master local[2] --driver-class-path /usr/local/hive/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-protocol-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-hadoop-compat-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar:/usr/local/hbase/lib/htrace-core-2.04.jar:/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2-tests.jar:/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2-tests.jar:/usr/local/hive/lib/zookeeper-3.4.6.jar:/usr/local/hive/lib/guava-14.0.1.jar
> In spark-shell:
> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
> sqlContext.sql("select count(*) from test.sample").collect()
> I added this setting in hadoop-env.sh:
> export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_HOME/lib/*
> *Stack trace*:
> SQL context available as sqlContext.
> scala> sqlContext.sql("select count(*) from test.sample").collect()
> 16/09/02 04:49:28 INFO parse.ParseDriver: Parsing command: select count(*) from test.sample
> 16/09/02 04:49:35 INFO parse.ParseDriver: Parse Completed
> 16/09/02 04:49:40 INFO metastore.HiveMetaStore: 0: get_table : db=test tbl=sample
> 16/09/02 04:49:40 INFO HiveMetaStore.audit: ugi=hdfs	ip=unknown-ip-addr	cmd=get_table : db=test tbl=sample
> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
> 	at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
> 	at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
> 	at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
> 	at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
> 	at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
> 	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
> 	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
> 	at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:331)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:326)
> 	at scala.Option.map(Option.scala:145)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:326)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:321)
> 	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:279)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:226)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:225)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:268)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.getTableOption(ClientWrapper.scala:321)
> 	at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:122)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
> 	at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:384)
> 	at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:457)
> 	at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:161)
> 	at org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:457)
> 	at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:303)
>
> Could somebody help me resolve this error? I would really appreciate the help.
> Thank you.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
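[Editor's note] The question above asks how to get the HBase classes onto the classpath. One common approach with spark-shell (a sketch, not verified against this exact cluster) is to pass the jars through `--jars`, which takes a comma-separated list and puts the jars on the classpath of the whole application, instead of only `--driver-class-path`, which is colon-separated and affects just the driver JVM. The jar names and paths below are taken from the report; versions and install locations are assumptions to adjust for your setup:

```shell
# Sketch only: jar paths/versions are the ones from the report above;
# adjust them for your installation.
# Helper: join its arguments with commas, the format --jars expects
# (unlike the colon-separated --driver-class-path).
join_jars() { local IFS=,; echo "$*"; }

JARS=$(join_jars \
  /usr/local/hive/lib/hive-hbase-handler-1.2.1.jar \
  /usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar \
  /usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar \
  /usr/local/hbase/lib/hbase-server-0.98.9-hadoop2.jar \
  /usr/local/hbase/lib/hbase-protocol-0.98.9-hadoop2.jar \
  /usr/local/hbase/lib/htrace-core-2.04.jar \
  /usr/local/hbase/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar)
echo "$JARS"

# Then launch the shell with the jars visible to driver and executors, e.g.:
#   spark-shell --master local[2] --jars "$JARS"
# and query the table as before:
#   sqlContext.sql("select count(*) from test.sample").collect()
```

Note also that the original multi-line `--driver-class-path` value would only work if the shell actually received it as a single colon-separated argument; stray line breaks in the middle of the command would silently truncate the classpath, which is consistent with the `NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes` seen here (that class lives in hbase-common).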