hbase-user mailing list archives

From Siva <sbhavan...@gmail.com>
Subject Re: HBase Spark Streaming issue
Date Mon, 21 Sep 2015 22:55:58 GMT
Hi Ted,

I generated a jar file from the same code and submitted it with spark-submit
in yarn-cluster mode. It went through fine, and I can see the Kafka data in
HBase.

Not sure what the exact issue with spark-shell is. Here is the spark-submit
command:

spark-submit --class TestHbaseSpark --deploy-mode cluster --master
yarn-cluster --jars
/home/sbhavanari/kafka_2.10-0.8.0.jar,/home/sbhavanari/spark-streaming-kafka_2.10-1.2.0.jar,/usr/hdp/2.2.4.2-2/kafka/libs/metrics-core-2.2.0.jar,/home/sbhavanari/zkclient-0.1.0.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-2.04.jar,/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-hadoop-compat-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hadoop/hadoop-common-2.6.0.2.2.4.2-2.jar
testhbasespark_2.10-0.0.1.jar
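
A minimal diagnostic sketch for the spark-shell side (not from the thread; it
uses only standard JDK and Hadoop calls) is to check which jar supplies
hbase-default.xml and what version stamp it carries:

// Run inside spark-shell. If the URL points into spark-assembly-*.jar rather
// than the HDP hbase-common jar, a stale hbase-default.xml is shadowing the
// real one, which would match the (null) version in the error further down.
println(getClass.getClassLoader.getResource("hbase-default.xml"))

// Load just that resource and read its version stamp; a null here reproduces
// the checkDefaultsVersion failure quoted below.
val conf = new org.apache.hadoop.conf.Configuration(false)
conf.addResource("hbase-default.xml")
println(conf.get("hbase.defaults.for.version"))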


Thanks,
Siva.

On Mon, Sep 21, 2015 at 11:58 AM, Siva <sbhavanari@gmail.com> wrote:

> I tried running after removing that jar as well; still the same issue. That
> jar contains the Spark-to-HBase connector developed by Cloudera.
>
> Thanks
>
> On Mon, Sep 21, 2015 at 11:56 AM, Ted Yu <yuzhihong@gmail.com> wrote:
>
>> bq. spark-hbase-0.0.3-clabs-20150225.184251-1.jar
>>
>> The above jar is yours, right?
>>
>> Can you check its contents?
>>
>> Thanks
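
One way to check a jar's contents from the Scala REPL, as a sketch (the path
is the one quoted above; adjust as needed):

import java.util.jar.JarFile
import scala.collection.JavaConverters._

// List any bundled hbase-default.xml inside the suspect jar; a hit here
// would explain the (null) default version seen in the error below.
val jar = new JarFile("/home/sbhavanari/spark-hbase-0.0.3-clabs-20150225.184251-1.jar")
jar.entries.asScala.map(_.getName).filter(_.contains("hbase-default")).foreach(println)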
>>
>> On Mon, Sep 21, 2015 at 11:40 AM, Siva <sbhavanari@gmail.com> wrote:
>>
>> > Hi Ted,
>> >
>> > Thanks for your response. I verified all the jars in the classpath; all
>> > of them are on 2.2.4.2-2.
>> >
>> > Here are the spark-shell jars I'm using. I verified my .bash_profile and
>> > it looks good. Are there any other places to look?
>> >
>> > spark-shell --jars
>> >
>> >
>> /usr/hdp/2.2.4.2-2/hbase/lib/hbase-protocol-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-client-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-common-0.98.4.2.2.4.2-2-hadoop2.jar,/usr/hdp/2.2.4.2-2/hbase/lib/htrace-core-3.0.4.jar,/usr/hdp/2.2.4.2-2/hbase/lib/guava-12.0.1.jar,/usr/hdp/2.2.4.2-2/hbase/lib/hbase-server-0.98.4.2.2.4.2-2-hadoop2.jar,/home/common/spark-1.2.1-bin-hadoop2.4/lib/spark-assembly-1.2.1-hadoop2.4.0.jar,/home/sbhavanari/spark-streaming-kafka_2.10-1.2.0.jar,/home/sbhavanari/kafka_2.10-0.8.0.jar,/home/sbhavanari/spark-hbase-0.0.3-clabs-20150225.184251-1.jar,/home/sbhavanari/zkclient-0.1.0.jar,/usr/hdp/2.2.4.2-2/kafka/libs/metrics-core-2.2.0.jar
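
As a side note, a hedged way to confirm from inside the shell which jars
actually landed on the classpath (standard Spark 1.2 and JDK properties):

// spark.jars is set by --jars; java.class.path shows the driver classpath.
println(sc.getConf.get("spark.jars"))
println(System.getProperty("java.class.path"))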
>> >
>> > Thanks
>> >
>> >
>> > On Mon, Sep 21, 2015 at 4:30 AM, Ted Yu <yuzhihong@gmail.com> wrote:
>> >
>> > > Dropping dev@
>> > >
>> > > Looks like another version of the hbase artifact is in the classpath.
>> > > Can you double-check?
>> > >
>> > > Thanks
>> > >
>> > > > On Sep 21, 2015, at 12:45 AM, Siva <sbhavanari@gmail.com> wrote:
>> > > >
>> > > > Hi,
>> > > >
>> > > > I'm seeing a strange error while inserting data from Spark Streaming
>> > > > into HBase.
>> > > >
>> > > > I can write the data from Spark (without streaming) to HBase
>> > > > successfully, but when I use the same code to write a DStream I see
>> > > > the error below.
>> > > >
>> > > > I tried setting the parameters below, but it still didn't help. Did
>> > > > anyone face a similar issue?
>> > > >
>> > > > conf.set("hbase.defaults.for.version.skip", "true")
>> > > > conf.set("hbase.defaults.for.version", "0.98.4.2.2.4.2-2-hadoop2")
>> > > >
>> > > > 15/09/20 22:39:10 ERROR Executor: Exception in task 0.0 in stage 14.0 (TID 16)
>> > > > java.lang.RuntimeException: hbase-default.xml file seems to be for and old version of HBase (null), this version is 0.98.4.2.2.4.2-2-hadoop2
>> > > >        at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:73)
>> > > >        at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:105)
>> > > >        at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:116)
>> > > >        at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:125)
>> > > >        at $line51.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$HBaseConn$.hbaseConnection(<console>:49)
>> > > >        at $line52.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$TestHbaseSpark$$anonfun$run$1$$anonfun$apply$1.apply(<console>:73)
>> > > >        at $line52.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$TestHbaseSpark$$anonfun$run$1$$anonfun$apply$1.apply(<console>:73)
>> > > >        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>> > > >        at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
>> > > >        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:782)
>> > > >        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:782)
>> > > >        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1353)
>> > > >        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1353)
>> > > >        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
>> > > >        at org.apache.spark.scheduler.Task.run(Task.scala:56)
>> > > >        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
>> > > >        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> > > >        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> > > >        at java.lang.Thread.run(Thread.java:745)
>> > > > 15/09/20 22:39:10 WARN TaskSetManager: Lost task 0.0 in stage 14.0 (TID 16, localhost): java.lang.RuntimeException: hbase-default.xml file seems to be for and old version of HBase (null), this version is 0.98.4.2.2.4.2-2-hadoop2
>> > > >
>> > > >
>> > > > Thanks,
>> > > > Siva.
>> > >
>> >
>>
>
>
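
For context, the stack trace above shows HBaseConfiguration.create() being
called inside the task (via HBaseConn.hbaseConnection within rdd.foreach), so
the version check runs on the executors, not the driver. A rough sketch of
that write path, with hypothetical table and column names and the 0.98-era
client API (not the thread's actual code):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

// Hypothetical per-partition writer; the names below are assumptions.
def writePartition(rows: Iterator[(String, String)]): Unit = {
  // This create() is where checkDefaultsVersion throws when a stale
  // hbase-default.xml (version null) wins on the executor classpath.
  val conf = HBaseConfiguration.create()
  val table = new HTable(conf, "test_table")
  rows.foreach { case (rowKey, value) =>
    val put = new Put(Bytes.toBytes(rowKey))
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
    table.put(put)
  }
  table.close()
}

// Wiring it into the streaming job:
// dstream.foreachRDD(rdd => rdd.foreachPartition(writePartition))

Opening one HTable per partition, rather than per record as the per-element
foreach in the trace suggests, also cuts connection churn; either way, the
configuration is built on the executor, which is why the spark-shell
classpath matters there.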
