kylin-dev mailing list archives

From 吕卓然 <>
Subject Re: A problem in cube building with SPARK
Date Wed, 17 May 2017 04:17:09 GMT
Hi Shaofeng,

I’ve attached the error log to this message.

From: ShaoFeng Shi []
Sent: May 17, 2017 10:38
To: user
Subject: Re: A problem in cube building with SPARK

Hi zhuoran, are there any more messages before this error? This error is not the root cause.

2017-05-17 10:27 GMT+08:00 吕卓然 <>:
Hi all,

Currently I am using Kylin 2.0.0 with CDH 5.8. It works fine when I use the MapReduce engine. However,
when I try to use the Spark engine to build a cube, it fails at step 7: Build Cube with Spark. Here
is the log info:

17/05/16 17:50:01 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, fonova-ahz-cdh34):
java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
                    at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(
                    at org.apache.kylin.common.persistence.ResourceStore.getStore(
                    at org.apache.kylin.cube.CubeDescManager.getStore(
                    at org.apache.kylin.cube.CubeDescManager.reloadAllCubeDesc(
                    at org.apache.kylin.cube.CubeDescManager.<init>(
                    at org.apache.kylin.cube.CubeDescManager.getInstance(
                    at org.apache.kylin.cube.CubeInstance.getDescriptor(
                    at org.apache.kylin.cube.CubeSegment.getCubeDesc(
                    at org.apache.kylin.cube.CubeSegment.isEnableSharding(
                    at org.apache.kylin.cube.kv.RowKeyEncoder.<init>(
                    at org.apache.kylin.cube.kv.AbstractRowKeyEncoder.createInstance(
                    at org.apache.kylin.engine.spark.SparkCubingByLayer$
                    at org.apache.kylin.engine.spark.SparkCubingByLayer$
                    at scala.collection.Iterator$$anon$
                    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)
                    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:64)
                    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
                    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
                    at org.apache.spark.executor.Executor$
                    at java.util.concurrent.ThreadPoolExecutor.runWorker(
                    at java.util.concurrent.ThreadPoolExecutor$
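The `IllegalArgumentException` at the top of the trace means the Spark executor's `ResourceStore.createResourceStore` could not map the metadata URL `kylin_metadata@hbase` to a store implementation, which typically happens when the HBase client configuration is not visible on the executor. As a rough illustration (hypothetical class and method names, not Kylin's actual code), the failing dispatch works roughly like this: the URL's `@scheme` suffix is looked up in a registry of store factories, and an unknown or unregistered scheme throws exactly this exception:

```java
// Hypothetical sketch of a "table@scheme" metadata-store factory.
// All names here are illustrative, not Kylin's real implementation.
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ResourceStoreSketch {
    // scheme -> store constructor; in a real deployment the "hbase" entry
    // would only be usable when the HBase client config is on the classpath.
    private static final Map<String, Function<String, String>> STORES = new HashMap<>();

    static {
        // Only an "hdfs" store is registered in this sketch, to mimic an
        // executor where the HBase-backed store is unavailable.
        STORES.put("hdfs", table -> "HdfsStore(" + table + ")");
    }

    public static String createResourceStore(String url) {
        int at = url.lastIndexOf('@');
        if (at < 0) {
            throw new IllegalArgumentException("Bad metadata url: " + url);
        }
        String scheme = url.substring(at + 1);
        Function<String, String> factory = STORES.get(scheme);
        if (factory == null) {
            // Mirrors the error in the log: the scheme "hbase" resolves to
            // no registered store on this JVM.
            throw new IllegalArgumentException(
                "Failed to find metadata store by url: " + url);
        }
        return factory.apply(url.substring(0, at));
    }

    public static void main(String[] args) {
        System.out.println(createResourceStore("kylin_metadata@hdfs"));
        try {
            createResourceStore("kylin_metadata@hbase");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, the thing to check is whether the executors can actually reach the HBase-backed metadata store (HBase config and Kylin metadata settings distributed to them), rather than anything in the cube definition itself.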

Any suggestions would help.


Best regards,

Shaofeng Shi 史少锋
