kylin-user mailing list archives

From "Prasanna" <prasann...@trinitymobility.com>
Subject Kylin 2.2.0 build fails at the 7th step.
Date Thu, 16 Nov 2017 11:33:09 GMT
Hi all,

 

I installed Kylin 2.2.0 by following
http://kylin.apache.org/docs21/tutorial/cube_spark.html. The Kylin service
started successfully. I then tried to build a Kylin cube with the Spark
engine, but the build fails at the 7th step, "Build Cube with Spark".
Please suggest how to solve this problem; it is high priority for me. My
logs are below.
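
For context, these should be the Spark engine settings in effect in my
kylin.properties. I am reconstructing them from the spark-submit flags that
Kylin logs further down, so treat this as an approximation of my file
rather than a verified copy:

$ grep '^kylin.engine.spark-conf' /usr/local/kylin/conf/kylin.properties
kylin.engine.spark-conf.spark.master=local[*]
kylin.engine.spark-conf.spark.executor.instances=1
kylin.engine.spark-conf.spark.executor.memory=1G
kylin.engine.spark-conf.spark.executor.cores=2
kylin.engine.spark-conf.spark.yarn.queue=default
kylin.engine.spark-conf.spark.yarn.archive=hdfs://trinitybdhdfs/kylin/spark/spark-libs.jar
kylin.engine.spark-conf.spark.eventLog.enabled=true
kylin.engine.spark-conf.spark.eventLog.dir=hdfs:///kylin/spark-history
kylin.engine.spark-conf.spark.history.fs.logDirectory=hdfs:///kylin/spark-history
kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227
kylin.engine.spark-conf.spark.hadoop.yarn.timeline-service.enabled=false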

 

 

2017-11-16 16:13:46,345 INFO  [pool-8-thread-1]
threadpool.DefaultScheduler:113 :
CubingJob{id=26342fa2-68ac-48e4-9eea-814206fb79e3, name=BUILD CUBE -
test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15
21:02:27, state=READY} prepare to schedule

2017-11-16 16:13:46,346 INFO  [pool-8-thread-1]
threadpool.DefaultScheduler:116 :
CubingJob{id=26342fa2-68ac-48e4-9eea-814206fb79e3, name=BUILD CUBE -
test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15
21:02:27, state=READY} scheduled

2017-11-16 16:13:46,346 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:111 :
Executing AbstractExecutable (BUILD CUBE - test_sample_cube -
20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27)

2017-11-16 16:13:46,349 INFO  [pool-8-thread-1]
threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 1 actual
running, 0 stopped, 1 ready, 3 already succeed, 0 error, 1 discarded, 0
others

2017-11-16 16:13:46,360 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 :
job id:26342fa2-68ac-48e4-9eea-814206fb79e3 from READY to RUNNING

2017-11-16 16:13:46,373 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:111 :
Executing AbstractExecutable (Build Cube with Spark)

2017-11-16 16:13:46,385 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 :
job id:26342fa2-68ac-48e4-9eea-814206fb79e3-06 from READY to RUNNING

2017-11-16 16:13:46,399 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:76 :
SPARK_HOME was set to /usr/local/kylin/spark

2017-11-16 16:13:46,399 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:120 : Using
/usr/local/kylin/hadoop-conf as HADOOP_CONF_DIR
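
(Note: per the tutorial, /usr/local/kylin/hadoop-conf is a directory I
created with the Hadoop client configs; if I remember the tutorial right it
should contain core-site.xml, hdfs-site.xml, yarn-site.xml, hive-site.xml
and hbase-site.xml, which can be sanity-checked with:

$ ls /usr/local/kylin/hadoop-conf
)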

2017-11-16 16:13:46,900 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:162 : Kylin
Config was updated with kylin.metadata.url :
/usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta

2017-11-16 16:13:46,901 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79 :
Using metadata url
/usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta
for resource store

2017-11-16 16:13:47,038 WARN  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] util.HeapMemorySizeUtil:55 :
hbase.regionserver.global.memstore.upperLimit is deprecated by
hbase.regionserver.global.memstore.size

2017-11-16 16:13:47,103 DEBUG [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.JobRelatedMetaUtil:70 :
Dump resources to
/usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta
took 203 ms

2017-11-16 16:13:47,105 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:162 : Kylin
Config was updated with kylin.metadata.url :
/usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta

2017-11-16 16:13:47,105 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79 :
Using metadata url
/usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta
for resource store

2017-11-16 16:13:47,105 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79 :
Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:13:47,155 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:76 : hdfs
meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb

2017-11-16 16:13:47,157 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceTool:167 :
Copy from
/usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta
to org.apache.kylin.storage.hdfs.HDFSResourceStore@2f8908ea
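
(To double-check that this metadata dump actually reached HDFS, the target
path can be listed directly; this is just the check I would run, with the
path copied verbatim from the log lines above:

$ hdfs dfs -ls -R hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
)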

2017-11-16 16:13:47,157 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /cube/test_sample_cube.json

2017-11-16 16:13:47,157 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube/test_sample_cube.json

2017-11-16 16:13:47,197 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /cube_desc/test_sample_cube.json

2017-11-16 16:13:47,197 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_desc/test_sample_cube.json

2017-11-16 16:13:47,320 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/cube_statistics/test_sample_cube/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb.seq

2017-11-16 16:13:47,320 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_statistics/test_sample_cube/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc
00dbb.seq

2017-11-16 16:13:48,998 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4e
ba5a.dict

2017-11-16 16:13:48,998 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8
ed-e18a5a4eba5a.dict

2017-11-16 16:13:49,031 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-11
1fe71bd39d.dict

2017-11-16 16:13:49,031 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4
912-8570-111fe71bd39d.dict

2017-11-16 16:13:49,064 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dic
t

2017-11-16 16:13:49,064 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55
ee3ecbf.dict

2017-11-16 16:13:49,097 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484
c-b5ca-4f1c2fb4dec0.dict

2017-11-16 16:13:49,098 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c40247
26-85bb-484c-b5ca-4f1c2fb4dec0.dict

2017-11-16 16:13:49,131 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131
dbe2d.dict

2017-11-16 16:13:49,131 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b
72e-696c131dbe2d.dict

2017-11-16 16:13:49,164 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208
.dict

2017-11-16 16:13:49,164 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9
a43099c1208.dict

2017-11-16 16:13:49,197 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d
.dict

2017-11-16 16:13:49,198 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f
6dc6e6db62d.dict

2017-11-16 16:13:49,231 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92
ad38ef16d0.dict

2017-11-16 16:13:49,231 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-4
9c6-b640-92ad38ef16d0.dict

2017-11-16 16:13:49,306 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641
496.dict

2017-11-16 16:13:49,306 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-874
5-8af522641496.dict

2017-11-16 16:13:49,339 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-
a2ff093aaf56.dict

2017-11-16 16:13:49,339 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55
-412e-b7d6-a2ff093aaf56.dict

2017-11-16 16:13:49,372 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a
222-954e1c13b537.dict

2017-11-16 16:13:49,373 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-
f62c-4d20-a222-954e1c13b537.dict

2017-11-16 16:13:49,406 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-
afd7-163137f3988e.dict

2017-11-16 16:13:49,406 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7
-6190-4951-afd7-163137f3988e.dict

2017-11-16 16:13:49,439 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-
93bc6786a0e8.dict

2017-11-16 16:13:49,439 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3
-4537-8198-93bc6786a0e8.dict

2017-11-16 16:13:49,472 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-
6e7511b0b57a.dict

2017-11-16 16:13:49,473 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092
-4d4c-83df-6e7511b0b57a.dict

2017-11-16 16:13:49,506 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6
f-05977b6e3260.dict

2017-11-16 16:13:49,506 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7e
c0-4074-9a6f-05977b6e3260.dict

2017-11-16 16:13:49,539 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8
a7a-0e1e4ef7dcd3.dict

2017-11-16 16:13:49,539 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-
42e4-4685-8a7a-0e1e4ef7dcd3.dict

2017-11-16 16:13:49,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-
bcf7-64d1b7b10fe0.dict

2017-11-16 16:13:49,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58
-19b3-46cb-bcf7-64d1b7b10fe0.dict

2017-11-16 16:13:49,606 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81d
c-fdac31787942.dict

2017-11-16 16:13:49,606 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2e
c9-4d28-81dc-fdac31787942.dict

2017-11-16 16:13:49,639 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-
bb82-fd0cf01bede5.dict

2017-11-16 16:13:49,639 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933
-8229-46c0-bb82-fd0cf01bede5.dict

2017-11-16 16:13:49,672 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-
47afa69bea83.dict

2017-11-16 16:13:49,673 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46
-48f6-8531-47afa69bea83.dict

2017-11-16 16:13:49,706 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e
82-8852-ab9bf2a6a114.dict

2017-11-16 16:13:49,706 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477
386-68e1-4e82-8852-ab9bf2a6a114.dict

2017-11-16 16:13:49,739 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9
b0b-64a7b03f8f84.dict

2017-11-16 16:13:49,739 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-
a2f9-403c-9b0b-64a7b03f8f84.dict

2017-11-16 16:13:49,772 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-962
2-aeaf4fe878b6.dict

2017-11-16 16:13:49,773 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce
7f-4553-9622-aeaf4fe878b6.dict

2017-11-16 16:13:49,806 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db
12084dea3.dict

2017-11-16 16:13:49,806 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4f
c1-9943-9db12084dea3.dict

2017-11-16 16:13:49,839 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7b
c1693c3f09.dict

2017-11-16 16:13:49,839 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4
421-b390-7bc1693c3f09.dict

2017-11-16 16:13:49,872 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952
-a7a5-c0119e3da826.dict

2017-11-16 16:13:49,873 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677
f-29b9-4952-a7a5-c0119e3da826.dict

2017-11-16 16:13:49,906 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5
a-ab01-6e5ac7152672.dict

2017-11-16 16:13:49,906 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d
33-8260-4f5a-ab01-6e5ac7152672.dict

2017-11-16 16:13:49,939 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path :
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223
ee929068d.dict

2017-11-16 16:13:49,939 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44
d4-a85e-223ee929068d.dict

2017-11-16 16:13:49,972 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /kylin.properties

2017-11-16 16:13:49,973 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/kylin.properties

2017-11-16 16:13:50,006 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /model_desc/test_sample_model.json

2017-11-16 16:13:50,006 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/model_desc/test_sample_model.json

2017-11-16 16:13:50,039 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /project/test_sample.json

2017-11-16 16:13:50,039 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/project/test_sample.json

2017-11-16 16:13:50,072 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /table/TRINITYICCC.INDEX_EVENT.json

2017-11-16 16:13:50,073 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/table/TRINITYICCC.INDEX_EVENT.json

2017-11-16 16:13:50,139 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /table/TRINITYICCC.PINMAPPING_FACT.json

2017-11-16 16:13:50,139 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/table/TRINITYICCC.PINMAPPING_FACT.json

2017-11-16 16:13:50,181 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res
path : /table/TRINITYICCC.V_ANALYST_INCIDENTS.json

2017-11-16 16:13:50,181 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put
resource :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/table/TRINITYICCC.V_ANALYST_INCIDENTS.json

2017-11-16 16:13:50,214 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:76 :
SPARK_HOME was set to /usr/local/kylin/spark

2017-11-16 16:13:50,215 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:149 : cmd:
export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && /usr/local/kylin/spark/bin/spark-submit
  --class org.apache.kylin.common.util.SparkEntry
  --conf spark.executor.instances=1
  --conf spark.yarn.archive=hdfs://trinitybdhdfs/kylin/spark/spark-libs.jar
  --conf spark.yarn.queue=default
  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227
  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history
  --conf spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227
  --conf spark.master=local[*]
  --conf spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227
  --conf spark.hadoop.yarn.timeline-service.enabled=false
  --conf spark.executor.memory=1G
  --conf spark.eventLog.enabled=true
  --conf spark.eventLog.dir=hdfs:///kylin/spark-history
  --conf spark.executor.cores=2
  --jars /usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar,
  /usr/local/kylin/lib/kylin-job-2.2.0.jar
  -className org.apache.kylin.engine.spark.SparkCubingByLayer
  -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb
  -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
  -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
  -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
  -cubename test_sample_cube
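
(If useful for reproducing this outside Kylin, the logged cmd: can be
re-run by hand to capture the full console output. A trimmed version,
keeping only the flags needed for local[*] mode; every value is copied
verbatim from the cmd: line above, and the tee target is just a scratch
file I chose:

export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf
/usr/local/kylin/spark/bin/spark-submit \
  --class org.apache.kylin.common.util.SparkEntry \
  --conf spark.master=local[*] \
  --jars /usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar \
  /usr/local/kylin/lib/kylin-job-2.2.0.jar \
  -className org.apache.kylin.engine.spark.SparkCubingByLayer \
  -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb \
  -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ \
  -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb \
  -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb \
  -cubename test_sample_cube \
  2>&1 | tee /tmp/kylin-step7-manual.log
)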

2017-11-16 16:13:51,639 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer
  -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb
  -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
  -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
  -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
  -cubename test_sample_cube

2017-11-16 16:13:51,649 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Abstract Application args:-hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb
  -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
  -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
  -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
  -cubename test_sample_cube

2017-11-16 16:13:51,725 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:51 INFO spark.SparkContext: Running Spark version 2.2.0

2017-11-16 16:13:52,221 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkContext: Submitted application: Cubing
for:test_sample_cube segment d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:13:52,247 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls to: hdfs

2017-11-16 16:13:52,248 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls to: hdfs

2017-11-16 16:13:52,248 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls groups to: 

2017-11-16 16:13:52,249 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls groups
to: 

2017-11-16 16:13:52,249 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users  with view permissions:
Set(hdfs); groups with view permissions: Set(); users  with modify
permissions: Set(hdfs); groups with modify permissions: Set()

2017-11-16 16:13:52,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO util.Utils: Successfully started service
'sparkDriver' on port 42799.

2017-11-16 16:13:52,593 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkEnv: Registering MapOutputTracker

2017-11-16 16:13:52,613 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkEnv: Registering BlockManagerMaster

2017-11-16 16:13:52,616 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information

2017-11-16 16:13:52,617 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint:
BlockManagerMasterEndpoint up

2017-11-16 16:13:52,634 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO storage.DiskBlockManager: Created local directory at
/tmp/blockmgr-b8d6ec0d-8a73-4ce6-9dbf-64002d5e2a62

2017-11-16 16:13:52,655 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO memory.MemoryStore: MemoryStore started with capacity
366.3 MB

2017-11-16 16:13:52,712 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator

2017-11-16 16:13:52,790 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO util.log: Logging initialized @2149ms

2017-11-16 16:13:52,859 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO server.Server: jetty-9.3.z-SNAPSHOT

2017-11-16 16:13:52,875 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO server.Server: Started @2235ms

2017-11-16 16:13:52,896 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO server.AbstractConnector: Started
ServerConnector@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}

2017-11-16 16:13:52,897 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO util.Utils: Successfully started service 'SparkUI' on
port 4040.

2017-11-16 16:13:52,923 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@f5c79a6{/jobs,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,923 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1fc793c2{/jobs/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,924 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@329a1243{/jobs/job,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,925 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@27f9e982{/jobs/job/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,926 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@37d3d232{/stages,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,926 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@581d969c{/stages/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,927 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2b46a8c1{/stages/stage,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,928 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5851bd4f{/stages/stage/json,null,AVAILABLE,@Sp
ark}

2017-11-16 16:13:52,929 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2f40a43{/stages/pool,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,930 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@69c43e48{/stages/pool/json,null,AVAILABLE,@Spa
rk}

2017-11-16 16:13:52,930 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@3a80515c{/storage,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,931 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1c807b1d{/storage/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,932 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1b39fd82{/storage/rdd,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,933 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@21680803{/storage/rdd/json,null,AVAILABLE,@Spa
rk}

2017-11-16 16:13:52,933 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@c8b96ec{/environment,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,934 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2d8f2f3a{/environment/json,null,AVAILABLE,@Spa
rk}

2017-11-16 16:13:52,935 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@7048f722{/executors,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,936 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@58a55449{/executors/json,null,AVAILABLE,@Spark
}

2017-11-16 16:13:52,936 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6e0ff644{/executors/threadDump,null,AVAILABLE,
@Spark}

2017-11-16 16:13:52,937 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2a2bb0eb{/executors/threadDump/json,null,AVAIL
ABLE,@Spark}

2017-11-16 16:13:52,945 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2d0566ba{/static,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,946 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@29d2d081{/,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,948 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@58783f6c{/api,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,948 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@88d6f9b{/jobs/job/kill,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,949 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@475b7792{/stages/stage/kill,null,AVAILABLE,@Sp
ark}

2017-11-16 16:13:52,952 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at
http://192.168.1.135:4040

2017-11-16 16:13:52,977 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar at
spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with
timestamp 1510829032976

2017-11-16 16:13:52,977 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar at
spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp
1510829032977

2017-11-16 16:13:52,977 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar at
spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp
1510829032977

2017-11-16 16:13:52,978 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/local/kylin/lib/kylin-job-2.2.0.jar at
spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp
1510829032978

2017-11-16 16:13:53,042 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO executor.Executor: Starting executor ID driver on
host localhost

2017-11-16 16:13:53,061 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 33164.

2017-11-16 16:13:53,066 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO netty.NettyBlockTransferService: Server created on
192.168.1.135:33164

2017-11-16 16:13:53,068 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO storage.BlockManager: Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy

2017-11-16 16:13:53,070 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registering BlockManager
BlockManagerId(driver, 192.168.1.135, 33164, None)

2017-11-16 16:13:53,073 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO storage.BlockManagerMasterEndpoint: Registering block
manager 192.168.1.135:33164 with 366.3 MB RAM, BlockManagerId(driver,
192.168.1.135, 33164, None)

2017-11-16 16:13:53,076 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registered BlockManager
BlockManagerId(driver, 192.168.1.135, 33164, None)

2017-11-16 16:13:53,076 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO storage.BlockManager: Initialized BlockManager:
BlockManagerId(driver, 192.168.1.135, 33164, None)

2017-11-16 16:13:53,225 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:53 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1b9c1b51{/metrics/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:54,057 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO scheduler.EventLoggingListener: Logging events to
hdfs:///kylin/spark-history/local-1510829033012
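
(Since event logging is enabled, the event log of this run should be
retrievable afterwards for inspection; local-1510829033012 is the
application id from the line above:

$ hdfs dfs -ls /kylin/spark-history | grep local-1510829033012
)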

2017-11-16 16:13:54,093 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO common.AbstractHadoopJob: Ready to load KylinConfig
from uri:
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:13:54,250 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeManager: Initializing CubeManager with
config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:13:54,254 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO persistence.ResourceStore: Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:13:54,282 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO hdfs.HDFSResourceStore: hdfs meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb

2017-11-16 16:13:54,295 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeManager: Loading Cube from folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube

2017-11-16 16:13:54,640 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeDescManager: Initializing CubeDescManager
with config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:13:54,641 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeDescManager: Reloading Cube Metadata from
folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_desc

2017-11-16 16:13:54,705 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO project.ProjectManager: Initializing ProjectManager
with metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:13:54,761 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: Checking custom measure
types from kylin config

2017-11-16 16:13:54,762 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
COUNT_DISTINCT(hllc), class
org.apache.kylin.measure.hllc.HLLCMeasureType$Factory

2017-11-16 16:13:54,768 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
COUNT_DISTINCT(bitmap), class
org.apache.kylin.measure.bitmap.BitmapMeasureType$Factory

2017-11-16 16:13:54,774 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering TOP_N(topn),
class org.apache.kylin.measure.topn.TopNMeasureType$Factory

2017-11-16 16:13:54,776 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering RAW(raw),
class org.apache.kylin.measure.raw.RawMeasureType$Factory

2017-11-16 16:13:54,778 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
EXTENDED_COLUMN(extendedcolumn), class
org.apache.kylin.measure.extendedcolumn.ExtendedColumnMeasureType$Factory

2017-11-16 16:13:54,780 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
PERCENTILE(percentile), class
org.apache.kylin.measure.percentile.PercentileMeasureType$Factory

2017-11-16 16:13:54,800 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO metadata.MetadataManager: Reloading data model at
/model_desc/test_sample_model.json

2017-11-16 16:13:54,931 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeDescManager: Loaded 1 Cube(s)

2017-11-16 16:13:54,932 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeManager: Reloaded cube test_sample_cube
being CUBE[name=test_sample_cube] having 1 segments

2017-11-16 16:13:54,932 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes

2017-11-16 16:13:54,942 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:54 INFO spark.SparkCubingByLayer: RDD Output path:
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-8142
06fb79e3/test_sample_cube/cuboid/

2017-11-16 16:13:55,758 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO spark.SparkCubingByLayer: All measure are normal (agg
on all cuboids) ? : true

2017-11-16 16:13:55,868 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO internal.SharedState: loading hive config file:
file:/usr/local/spark/conf/hive-site.xml

2017-11-16 16:13:55,888 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO internal.SharedState: spark.sql.warehouse.dir is not
set, but hive.metastore.warehouse.dir is set. Setting
spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir
('/apps/hive/warehouse').

2017-11-16 16:13:55,889 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO internal.SharedState: Warehouse path is
'/apps/hive/warehouse'.

2017-11-16 16:13:55,895 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@75cf0de5{/SQL,null,AVAILABLE,@Spark}

2017-11-16 16:13:55,895 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@468173fa{/SQL/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:55,896 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@27e2287c{/SQL/execution,null,AVAILABLE,@Spark}

2017-11-16 16:13:55,896 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2cd388f5{/SQL/execution/json,null,AVAILABLE,@S
park}

2017-11-16 16:13:55,898 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4207852d{/static/sql,null,AVAILABLE,@Spark}

2017-11-16 16:13:56,397 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:56 INFO hive.HiveUtils: Initializing HiveMetastoreConnection
version 1.2.1 using Spark classes.

2017-11-16 16:13:57,548 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

2017-11-16 16:13:57,583 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.

2017-11-16 16:13:57,709 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO session.SessionState: Created local directory:
/tmp/58660d8b-48ac-4cf0-bd06-6b96018a5482_resources

2017-11-16 16:13:57,732 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482

2017-11-16 16:13:57,734 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO session.SessionState: Created local directory:
/tmp/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482

2017-11-16 16:13:57,738 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482/_tmp_space.db

2017-11-16 16:13:57,740 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO client.HiveClientImpl: Warehouse location for Hive
client (version 1.2.1) is /apps/hive/warehouse

2017-11-16 16:13:57,751 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO sqlstd.SQLStdHiveAccessController: Created
SQLStdHiveAccessController for session context : HiveAuthzSessionContext
[sessionString=58660d8b-48ac-4cf0-bd06-6b96018a5482, clientType=HIVECLI]

2017-11-16 16:13:57,752 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO hive.metastore: Mestastore configuration
hive.metastore.filter.hook changed from
org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaSto
reFilterHook

2017-11-16 16:13:57,756 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

2017-11-16 16:13:57,757 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.

2017-11-16 16:13:58,073 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration
hive.metastore.filter.hook changed from
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaSto
reFilterHook to
org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl

2017-11-16 16:13:58,073 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

2017-11-16 16:13:58,075 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

2017-11-16 16:13:58,078 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO session.SessionState: Created local directory:
/tmp/bd69eb21-01c1-4dd3-b31c-16e065ab4101_resources

2017-11-16 16:13:58,088 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101

2017-11-16 16:13:58,089 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO session.SessionState: Created local directory:
/tmp/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101

2017-11-16 16:13:58,096 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101/_tmp_space.db

2017-11-16 16:13:58,097 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO client.HiveClientImpl: Warehouse location for Hive
client (version 1.2.1) is /apps/hive/warehouse

2017-11-16 16:13:58,098 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO sqlstd.SQLStdHiveAccessController: Created
SQLStdHiveAccessController for session context : HiveAuthzSessionContext
[sessionString=bd69eb21-01c1-4dd3-b31c-16e065ab4101, clientType=HIVECLI]

2017-11-16 16:13:58,098 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration
hive.metastore.filter.hook changed from
org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaSto
reFilterHook

2017-11-16 16:13:58,098 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

2017-11-16 16:13:58,100 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

2017-11-16 16:13:58,139 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO state.StateStoreCoordinatorRef: Registered
StateStoreCoordinator endpoint

2017-11-16 16:13:58,143 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO execution.SparkSqlParser: Parsing command:
default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc0
0dbb

2017-11-16 16:13:58,292 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

2017-11-16 16:13:58,294 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

2017-11-16 16:13:58,345 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,355 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,355 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: timestamp

2017-11-16 16:13:58,357 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,357 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,358 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,358 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double

2017-11-16 16:13:58,358 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double

2017-11-16 16:13:58,359 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,359 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,359 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,360 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,360 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,360 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,361 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,361 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,361 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,362 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,362 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

2017-11-16 16:13:58,362 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: boolean

2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:13:58,364 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:14:00,368 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0 stored as
values in memory (estimated size 373.5 KB, free 365.9 MB)

2017-11-16 16:14:00,685 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0_piece0 stored
as bytes in memory (estimated size 35.8 KB, free 365.9 MB)

2017-11-16 16:14:00,688 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:00 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in
memory on 192.168.1.135:33164 (size: 35.8 KB, free: 366.3 MB)

2017-11-16 16:14:00,691 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:00 INFO spark.SparkContext: Created broadcast 0 from 

2017-11-16 16:14:01,094 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-
47afa69bea83.dict

2017-11-16 16:14:01,106 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-962
2-aeaf4fe878b6.dict

2017-11-16 16:14:01,111 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-
afd7-163137f3988e.dict

2017-11-16 16:14:01,115 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6
f-05977b6e3260.dict

2017-11-16 16:14:01,119 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8
a7a-0e1e4ef7dcd3.dict

2017-11-16 16:14:01,122 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-
bcf7-64d1b7b10fe0.dict

2017-11-16 16:14:01,127 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81d
c-fdac31787942.dict

2017-11-16 16:14:01,131 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a
222-954e1c13b537.dict

2017-11-16 16:14:01,135 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e
82-8852-ab9bf2a6a114.dict

2017-11-16 16:14:01,139 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db
12084dea3.dict

2017-11-16 16:14:01,142 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7b
c1693c3f09.dict

2017-11-16 16:14:01,146 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-
bb82-fd0cf01bede5.dict

2017-11-16 16:14:01,149 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9
b0b-64a7b03f8f84.dict

2017-11-16 16:14:01,152 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5
a-ab01-6e5ac7152672.dict

2017-11-16 16:14:01,156 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223
ee929068d.dict

2017-11-16 16:14:01,159 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952
-a7a5-c0119e3da826.dict

2017-11-16 16:14:01,164 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-
93bc6786a0e8.dict

2017-11-16 16:14:01,167 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-
6e7511b0b57a.dict

2017-11-16 16:14:01,171 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-
a2ff093aaf56.dict

2017-11-16 16:14:01,174 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641
496.dict

2017-11-16 16:14:01,178 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4e
ba5a.dict

2017-11-16 16:14:01,181 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-11
1fe71bd39d.dict

2017-11-16 16:14:01,184 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d
.dict

2017-11-16 16:14:01,188 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92
ad38ef16d0.dict

2017-11-16 16:14:01,191 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484
c-b5ca-4f1c2fb4dec0.dict

2017-11-16 16:14:01,195 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dic
t

2017-11-16 16:14:01,199 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208
.dict

2017-11-16 16:14:01,202 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131
dbe2d.dict

2017-11-16 16:14:01,261 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO common.CubeStatsReader: Estimating size for layer 0,
all cuboids are 536870911, total size is 0.010198831558227539

2017-11-16 16:14:01,261 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO spark.SparkCubingByLayer: Partition for spark cubing:
1

2017-11-16 16:14:01,353 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO output.FileOutputCommitter: File Output Committer
Algorithm version is 1

2017-11-16 16:14:01,422 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO spark.SparkContext: Starting job: runJob at
SparkHadoopMapReduceWriter.scala:88

2017-11-16 16:14:01,543 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO mapred.FileInputFormat: Total input paths to process
: 1

2017-11-16 16:14:01,623 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Registering RDD 6 (mapToPair
at SparkCubingByLayer.java:170)

2017-11-16 16:14:01,627 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Got job 0 (runJob at
SparkHadoopMapReduceWriter.scala:88) with 1 output partitions

2017-11-16 16:14:01,628 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 1
(runJob at SparkHadoopMapReduceWriter.scala:88)

2017-11-16 16:14:01,629 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Parents of final stage:
List(ShuffleMapStage 0)

2017-11-16 16:14:01,638 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Missing parents:
List(ShuffleMapStage 0)

2017-11-16 16:14:01,652 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0
(MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170), which has
no missing parents

2017-11-16 16:14:01,855 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1 stored as
values in memory (estimated size 25.8 KB, free 365.9 MB)

2017-11-16 16:14:01,892 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1_piece0 stored
as bytes in memory (estimated size 10.7 KB, free 365.9 MB)

2017-11-16 16:14:01,894 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in
memory on 192.168.1.135:33164 (size: 10.7 KB, free: 366.3 MB)

2017-11-16 16:14:01,896 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO spark.SparkContext: Created broadcast 1 from
broadcast at DAGScheduler.scala:1006

2017-11-16 16:14:01,922 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
from ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at
SparkCubingByLayer.java:170) (first 15 tasks are for partitions Vector(0))

2017-11-16 16:14:01,924 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with
1 tasks

2017-11-16 16:14:02,015 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO scheduler.TaskSetManager: Starting task 0.0 in stage
0.0 (TID 0, localhost, executor driver, partition 0, ANY, 4978 bytes)

2017-11-16 16:14:02,033 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID
0)

2017-11-16 16:14:02,044 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp
1510829032977

2017-11-16 16:14:02,159 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO client.TransportClientFactory: Successfully created
connection to /192.168.1.135:42799 after 64 ms (0 ms spent in bootstraps)

2017-11-16 16:14:02,179 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp5518529699147501519.tmp

2017-11-16 16:14:02,259 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/metrics-core-2.2.0.jar to class loader

2017-11-16 16:14:02,260 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp
1510829032977

2017-11-16 16:14:02,261 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/guava-12.0.1.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp2368368452706093062.tmp

2017-11-16 16:14:02,278 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/guava-12.0.1.jar to class loader

2017-11-16 16:14:02,278 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with
timestamp 1510829032976

2017-11-16 16:14:02,279 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp4539374910339958167.tmp

2017-11-16 16:14:02,295 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/htrace-core-3.1.0-incubating.jar to class loader

2017-11-16 16:14:02,295 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp
1510829032978

2017-11-16 16:14:02,296 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp9086394010889635270.tmp

2017-11-16 16:14:02,418 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/kylin-job-2.2.0.jar to class loader

2017-11-16 16:14:02,540 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO rdd.HadoopRDD: Input split:
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-8142
06fb79e3/kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc
00dbb/000000_0:0+19534

2017-11-16 16:14:02,569 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO zlib.ZlibFactory: Successfully loaded & initialized
native-zlib library

2017-11-16 16:14:02,570 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

2017-11-16 16:14:02,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

2017-11-16 16:14:02,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

2017-11-16 16:14:02,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

2017-11-16 16:14:03,035 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 251.01178 ms

2017-11-16 16:14:03,106 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 55.530064 ms

2017-11-16 16:14:03,148 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO common.AbstractHadoopJob: Ready to load KylinConfig
from uri:
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,170 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeManager: Initializing CubeManager with
config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,170 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO persistence.ResourceStore: Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:14:03,190 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO hdfs.HDFSResourceStore: hdfs meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb

2017-11-16 16:14:03,194 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeManager: Loading Cube from folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube

2017-11-16 16:14:03,198 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeDescManager: Initializing CubeDescManager
with config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,198 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeDescManager: Reloading Cube Metadata from
folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_desc

2017-11-16 16:14:03,206 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO project.ProjectManager: Initializing ProjectManager
with metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,213 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN cachesync.Broadcaster: More than one singleton exist

2017-11-16 16:14:03,213 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN project.ProjectManager: More than one singleton exist

2017-11-16 16:14:03,232 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO metadata.MetadataManager: Reloading data model at
/model_desc/test_sample_model.json

2017-11-16 16:14:03,237 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN metadata.MetadataManager: More than one singleton
exist, current keys: 1464031233,1545268424

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeDescManager: Loaded 1 Cube(s)

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN cube.CubeDescManager: More than one singleton exist

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeManager: Reloaded cube test_sample_cube
being CUBE[name=test_sample_cube] having 1 segments

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN cube.CubeManager: More than one singleton exist

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN cube.CubeManager: type: class
org.apache.kylin.common.KylinConfig reference: 1464031233

2017-11-16 16:14:03,239 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN cube.CubeManager: type: class
org.apache.kylin.common.KylinConfig reference: 1545268424

2017-11-16 16:14:03,283 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 WARN dict.DictionaryManager: More than one singleton exist

2017-11-16 16:14:03,283 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-
47afa69bea83.dict

2017-11-16 16:14:03,287 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-962
2-aeaf4fe878b6.dict

2017-11-16 16:14:03,290 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-
afd7-163137f3988e.dict

2017-11-16 16:14:03,294 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6
f-05977b6e3260.dict

2017-11-16 16:14:03,297 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8
a7a-0e1e4ef7dcd3.dict

2017-11-16 16:14:03,300 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-
bcf7-64d1b7b10fe0.dict

2017-11-16 16:14:03,303 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81d
c-fdac31787942.dict

2017-11-16 16:14:03,309 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a
222-954e1c13b537.dict

2017-11-16 16:14:03,315 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e
82-8852-ab9bf2a6a114.dict

2017-11-16 16:14:03,321 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db
12084dea3.dict

2017-11-16 16:14:03,329 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7b
c1693c3f09.dict

2017-11-16 16:14:03,335 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-
bb82-fd0cf01bede5.dict

2017-11-16 16:14:03,340 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9
b0b-64a7b03f8f84.dict

2017-11-16 16:14:03,345 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5
a-ab01-6e5ac7152672.dict

2017-11-16 16:14:03,351 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223
ee929068d.dict

2017-11-16 16:14:03,357 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952
-a7a5-c0119e3da826.dict

2017-11-16 16:14:03,363 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-
93bc6786a0e8.dict

2017-11-16 16:14:03,369 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-
6e7511b0b57a.dict

2017-11-16 16:14:03,375 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-
a2ff093aaf56.dict

2017-11-16 16:14:03,381 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641
496.dict

2017-11-16 16:14:03,386 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4e
ba5a.dict

2017-11-16 16:14:03,391 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-11
1fe71bd39d.dict

2017-11-16 16:14:03,397 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d
.dict

2017-11-16 16:14:03,402 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92
ad38ef16d0.dict

2017-11-16 16:14:03,408 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484
c-b5ca-4f1c2fb4dec0.dict

2017-11-16 16:14:03,414 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dic
t

2017-11-16 16:14:03,420 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208
.dict

2017-11-16 16:14:03,425 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131
dbe2d.dict

2017-11-16 16:14:04,031 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO executor.Executor: Finished task 0.0 in stage 0.0
(TID 0). 1347 bytes result sent to driver

2017-11-16 16:14:04,072 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage
0.0 (TID 0) in 2084 ms on localhost (executor driver) (1/1)

2017-11-16 16:14:04,075 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0,
whose tasks have all completed, from pool 

2017-11-16 16:14:04,082 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPair
at SparkCubingByLayer.java:170) finished in 2.118 s

2017-11-16 16:14:04,082 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: looking for newly runnable
stages

2017-11-16 16:14:04,083 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: running: Set()

2017-11-16 16:14:04,083 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)

2017-11-16 16:14:04,084 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: failed: Set()

2017-11-16 16:14:04,088 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting ResultStage 1
(MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238), which has
no missing parents

2017-11-16 16:14:04,134 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2 stored as
values in memory (estimated size 82.3 KB, free 365.8 MB)

2017-11-16 16:14:04,153 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2_piece0 stored
as bytes in memory (estimated size 31.6 KB, free 365.8 MB)

2017-11-16 16:14:04,154 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in
memory on 192.168.1.135:33164 (size: 31.6 KB, free: 366.2 MB)

2017-11-16 16:14:04,155 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO spark.SparkContext: Created broadcast 2 from
broadcast at DAGScheduler.scala:1006

2017-11-16 16:14:04,158 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
from ResultStage 1 (MapPartitionsRDD[8] at mapToPair at
SparkCubingByLayer.java:238) (first 15 tasks are for partitions Vector(0))

2017-11-16 16:14:04,158 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with
1 tasks

2017-11-16 16:14:04,160 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.TaskSetManager: Starting task 0.0 in stage
1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 bytes)

2017-11-16 16:14:04,160 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID
1)

2017-11-16 16:14:04,204 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1
non-empty blocks out of 1 blocks

2017-11-16 16:14:04,206 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote
fetches in 6 ms

2017-11-16 16:14:04,315 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO memory.MemoryStore: Block rdd_7_0 stored as bytes in
memory (estimated size 49.2 KB, free 365.7 MB)

2017-11-16 16:14:04,315 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory on
192.168.1.135:33164 (size: 49.2 KB, free: 366.2 MB)

2017-11-16 16:14:04,331 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer
Algorithm version is 1

2017-11-16 16:14:04,334 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer
Algorithm version is 1

2017-11-16 16:14:04,359 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO common.AbstractHadoopJob: Ready to load KylinConfig
from uri:
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,377 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO cube.CubeDescManager: Initializing CubeDescManager
with config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,377 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO persistence.ResourceStore: Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:14:04,393 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO hdfs.HDFSResourceStore: hdfs meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb

2017-11-16 16:14:04,394 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO cube.CubeDescManager: Reloading Cube Metadata from
folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_desc

2017-11-16 16:14:04,400 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO project.ProjectManager: Initializing ProjectManager
with metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,406 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 WARN cachesync.Broadcaster: More than one singleton exist

2017-11-16 16:14:04,406 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 WARN project.ProjectManager: More than one singleton exist

2017-11-16 16:14:04,423 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO metadata.MetadataManager: Reloading data model at
/model_desc/test_sample_model.json

2017-11-16 16:14:04,427 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 WARN metadata.MetadataManager: More than one singleton
exist, current keys: 1464031233,1545268424,1474775600

2017-11-16 16:14:04,428 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO cube.CubeDescManager: Loaded 1 Cube(s)

2017-11-16 16:14:04,428 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 WARN cube.CubeDescManager: More than one singleton exist

2017-11-16 16:14:04,498 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO output.FileOutputCommitter: Saved output of task
'attempt_20171116161401_0001_r_000000_0' to
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-8142
06fb79e3/test_sample_cube/cuboid/level_base_cuboid/_temporary/0/task_2017111
6161401_0001_r_000000

2017-11-16 16:14:04,499 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO mapred.SparkHadoopMapRedUtil:
attempt_20171116161401_0001_r_000000_0: Committed

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 ERROR executor.Executor: Exception in task 0.0 in stage
1.0 (TID 1)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)
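
The stack trace above is the actual failure: the job runs Kryo with class
registration required, and Spark's internal
FileCommitProtocol$TaskCommitMessage (the object an executor sends back to
the driver when a task commits its output) is not in the registered set, so
serializing the task result throws. Following the hint printed in the log, a
minimal registrator sketch is shown below; the package name com.example.kylin
and the class name TaskCommitMessageRegistrator are made up for illustration,
and the jar containing it has to be shipped to the executors:

package com.example.kylin;

import com.esotericsoftware.kryo.Kryo;
import org.apache.spark.serializer.KryoRegistrator;

// Sketch of a custom Spark KryoRegistrator that registers the class named
// in the error. Registering by name via Class.forName avoids needing a
// compile-time reference to Spark's inner class.
public class TaskCommitMessageRegistrator implements KryoRegistrator {
    @Override
    public void registerClasses(Kryo kryo) {
        try {
            kryo.register(Class.forName(
                "org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage"));
        } catch (ClassNotFoundException e) {
            // Fail fast if the Spark jar on the classpath predates this class.
            throw new RuntimeException(e);
        }
    }
}

As far as I can tell, later Kylin releases register this class in the bundled
org.apache.kylin.engine.spark.KylinKryoRegistrator, so upgrading Kylin is the
simpler route where possible.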

2017-11-16 16:14:04,543 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0
(TID 1, localhost, executor driver): java.lang.IllegalArgumentException:
Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 

2017-11-16 16:14:04,546 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 ERROR scheduler.TaskSetManager: Task 0 in stage 1.0 failed
1 times; aborting job

2017-11-16 16:14:04,547 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0,
whose tasks have all completed, from pool 

2017-11-16 16:14:04,551 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at
SparkHadoopMapReduceWriter.scala:88) failed in 0.393 s due to Job aborted
due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent
failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver):
java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Driver
stacktrace:

2017-11-16 16:14:04,557 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO scheduler.DAGScheduler: Job 0 failed: runJob at
SparkHadoopMapReduceWriter.scala:88, took 3.135125 s

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 ERROR io.SparkHadoopMapReduceWriter: Aborting job
job_20171116161401_0008.

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in
stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0
(TID 1, localhost, executor driver): java.lang.IllegalArgumentException:
Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Driver
stacktrace:

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at scala.Option.foreach(Option.scala:257)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)

2017-11-16 16:14:04,561 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:88)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)

2017-11-16 16:14:04,562 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(SparkCubingByLayer.java:238)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:192)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.reflect.Method.invoke(Method.java:498)

2017-11-16 16:14:04,563 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Caused
by: java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,564 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,565 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,565 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,565 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,565 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,570 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
Exception in thread "main" java.lang.RuntimeException: error execute
org.apache.kylin.engine.spark.SparkCubingByLayer

2017-11-16 16:14:04,570 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)

2017-11-16 16:14:04,570 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.reflect.Method.invoke(Method.java:498)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Caused
by: org.apache.spark.SparkException: Job aborted.

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:107)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

2017-11-16 16:14:04,571 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(SparkCubingByLayer.java:238)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:192)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
... 10 more

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Caused
by: org.apache.spark.SparkException: Job aborted due to stage failure: Task
0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage
1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException:
Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,572 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Driver
stacktrace:

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at scala.Option.foreach(Option.scala:257)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)

2017-11-16 16:14:04,573 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:88)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
... 21 more

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Caused
by: java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note:
To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,574 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
at java.lang.Thread.run(Thread.java:745)
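
Every "Caused by" in the chain above bottoms out in the same guard
(DefaultClassResolver.writeClass -> Kryo.getRegistration), which is plain
Kryo behaviour and reproducible outside Spark. A self-contained sketch
(KryoCheck is a made-up name; it assumes the Kryo 3/4 line that Spark 2.x
bundles):

import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Output;

import java.io.ByteArrayOutputStream;

public class KryoCheck {
    public static void main(String[] args) {
        Kryo kryo = new Kryo();
        // The same switch as spark.kryo.registrationRequired=true: any class
        // not registered up front is rejected the first time it is written.
        kryo.setRegistrationRequired(true);
        Output out = new Output(new ByteArrayOutputStream());
        // Throws java.lang.IllegalArgumentException: Class is not registered:
        // java.util.ArrayList -- the same shape as the failure in this log.
        kryo.writeClassAndObject(out, new java.util.ArrayList<String>());
        out.close();
    }
}

Registering the offending class up front (kryo.register(...), as the log's
"Note" suggests) makes the same write succeed.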

2017-11-16 16:14:04,575 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO spark.SparkContext: Invoking stop() from shutdown
hook

2017-11-16 16:14:04,579 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO server.AbstractConnector: Stopped
Spark@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}

2017-11-16 16:14:04,581 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO ui.SparkUI: Stopped Spark web UI at
http://192.168.1.135:4040

2017-11-16 16:14:04,636 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO spark.MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!

2017-11-16 16:14:04,643 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO memory.MemoryStore: MemoryStore cleared

2017-11-16 16:14:04,644 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO storage.BlockManager: BlockManager stopped

2017-11-16 16:14:04,649 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO storage.BlockManagerMaster: BlockManagerMaster
stopped

2017-11-16 16:14:04,651 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!

2017-11-16 16:14:04,653 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO spark.SparkContext: Successfully stopped SparkContext

2017-11-16 16:14:04,653 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO util.ShutdownHookManager: Shutdown hook called

2017-11-16 16:14:04,654 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :
17/11/16 16:14:04 INFO util.ShutdownHookManager: Deleting directory
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6

2017-11-16 16:14:05,140 ERROR [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:156 : error
run spark job:

java.io.IOException: OS command error exit with return code: 1, error
message: SparkEntry args:-className
org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable
default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb
-output
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
-segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
-cubename test_sample_cube

Abstract Application args:-hiveTable
default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb
-output
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
-segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
-cubename test_sample_cube

17/11/16 16:13:51 INFO spark.SparkContext: Running Spark version 2.2.0

17/11/16 16:13:52 INFO spark.SparkContext: Submitted application: Cubing
for:test_sample_cube segment d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls to: hdfs

17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls to: hdfs

17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls groups to: 

17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls groups
to: 

17/11/16 16:13:52 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users  with view permissions:
Set(hdfs); groups with view permissions: Set(); users  with modify
permissions: Set(hdfs); groups with modify permissions: Set()

17/11/16 16:13:52 INFO util.Utils: Successfully started service
'sparkDriver' on port 42799.

17/11/16 16:13:52 INFO spark.SparkEnv: Registering MapOutputTracker

17/11/16 16:13:52 INFO spark.SparkEnv: Registering BlockManagerMaster

17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: Using
org.apache.spark.storage.DefaultTopologyMapper for getting topology
information

17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint:
BlockManagerMasterEndpoint up

17/11/16 16:13:52 INFO storage.DiskBlockManager: Created local directory at
/tmp/blockmgr-b8d6ec0d-8a73-4ce6-9dbf-64002d5e2a62

17/11/16 16:13:52 INFO memory.MemoryStore: MemoryStore started with capacity
366.3 MB

17/11/16 16:13:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator

17/11/16 16:13:52 INFO util.log: Logging initialized @2149ms

17/11/16 16:13:52 INFO server.Server: jetty-9.3.z-SNAPSHOT

17/11/16 16:13:52 INFO server.Server: Started @2235ms

17/11/16 16:13:52 INFO server.AbstractConnector: Started
ServerConnector@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}

17/11/16 16:13:52 INFO util.Utils: Successfully started service 'SparkUI' on
port 4040.

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@f5c79a6{/jobs,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1fc793c2{/jobs/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@329a1243{/jobs/job,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@27f9e982{/jobs/job/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@37d3d232{/stages,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@581d969c{/stages/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2b46a8c1{/stages/stage,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5851bd4f{/stages/stage/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2f40a43{/stages/pool,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@69c43e48{/stages/pool/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@3a80515c{/storage,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1c807b1d{/storage/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1b39fd82{/storage/rdd,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@21680803{/storage/rdd/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@c8b96ec{/environment,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2d8f2f3a{/environment/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@7048f722{/executors,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@58a55449{/executors/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6e0ff644{/executors/threadDump,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2a2bb0eb{/executors/threadDump/json,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2d0566ba{/static,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@29d2d081{/,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@58783f6c{/api,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@88d6f9b{/jobs/job/kill,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@475b7792{/stages/stage/kill,null,AVAILABLE,@Spark}

17/11/16 16:13:52 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at
http://192.168.1.135:4040

17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar at
spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with
timestamp 1510829032976

17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar at
spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp
1510829032977

17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar at
spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp
1510829032977

17/11/16 16:13:52 INFO spark.SparkContext: Added JAR
file:/usr/local/kylin/lib/kylin-job-2.2.0.jar at
spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp
1510829032978

17/11/16 16:13:53 INFO executor.Executor: Starting executor ID driver on
host localhost

17/11/16 16:13:53 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 33164.

17/11/16 16:13:53 INFO netty.NettyBlockTransferService: Server created on
192.168.1.135:33164

17/11/16 16:13:53 INFO storage.BlockManager: Using
org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
policy

17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registering BlockManager
BlockManagerId(driver, 192.168.1.135, 33164, None)

17/11/16 16:13:53 INFO storage.BlockManagerMasterEndpoint: Registering block
manager 192.168.1.135:33164 with 366.3 MB RAM, BlockManagerId(driver,
192.168.1.135, 33164, None)

17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registered BlockManager
BlockManagerId(driver, 192.168.1.135, 33164, None)

17/11/16 16:13:53 INFO storage.BlockManager: Initialized BlockManager:
BlockManagerId(driver, 192.168.1.135, 33164, None)

17/11/16 16:13:53 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@1b9c1b51{/metrics/json,null,AVAILABLE,@Spark}

17/11/16 16:13:54 INFO scheduler.EventLoggingListener: Logging events to
hdfs:///kylin/spark-history/local-1510829033012

17/11/16 16:13:54 INFO common.AbstractHadoopJob: Ready to load KylinConfig
from uri:
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:13:54 INFO cube.CubeManager: Initializing CubeManager with
config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:13:54 INFO persistence.ResourceStore: Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

17/11/16 16:13:54 INFO hdfs.HDFSResourceStore: hdfs meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:13:54 INFO cube.CubeManager: Loading Cube from folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube

17/11/16 16:13:54 INFO cube.CubeDescManager: Initializing CubeDescManager
with config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:13:54 INFO cube.CubeDescManager: Reloading Cube Metadata from
folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc

17/11/16 16:13:54 INFO project.ProjectManager: Initializing ProjectManager
with metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: Checking custom measure
types from kylin config

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
COUNT_DISTINCT(hllc), class
org.apache.kylin.measure.hllc.HLLCMeasureType$Factory

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
COUNT_DISTINCT(bitmap), class
org.apache.kylin.measure.bitmap.BitmapMeasureType$Factory

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering TOP_N(topn),
class org.apache.kylin.measure.topn.TopNMeasureType$Factory

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering RAW(raw),
class org.apache.kylin.measure.raw.RawMeasureType$Factory

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
EXTENDED_COLUMN(extendedcolumn), class
org.apache.kylin.measure.extendedcolumn.ExtendedColumnMeasureType$Factory

17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering
PERCENTILE(percentile), class
org.apache.kylin.measure.percentile.PercentileMeasureType$Factory

17/11/16 16:13:54 INFO metadata.MetadataManager: Reloading data model at
/model_desc/test_sample_model.json

17/11/16 16:13:54 INFO cube.CubeDescManager: Loaded 1 Cube(s)

17/11/16 16:13:54 INFO cube.CubeManager: Reloaded cube test_sample_cube
being CUBE[name=test_sample_cube] having 1 segments

17/11/16 16:13:54 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes

17/11/16 16:13:54 INFO spark.SparkCubingByLayer: RDD Output path:
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/

17/11/16 16:13:55 INFO spark.SparkCubingByLayer: All measure are normal (agg
on all cuboids) ? : true

17/11/16 16:13:55 INFO internal.SharedState: loading hive config file:
file:/usr/local/spark/conf/hive-site.xml

17/11/16 16:13:55 INFO internal.SharedState: spark.sql.warehouse.dir is not
set, but hive.metastore.warehouse.dir is set. Setting
spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir
('/apps/hive/warehouse').

17/11/16 16:13:55 INFO internal.SharedState: Warehouse path is
'/apps/hive/warehouse'.

17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@75cf0de5{/SQL,null,AVAILABLE,@Spark}

17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@468173fa{/SQL/json,null,AVAILABLE,@Spark}

17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@27e2287c{/SQL/execution,null,AVAILABLE,@Spark}

17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2cd388f5{/SQL/execution/json,null,AVAILABLE,@Spark}

17/11/16 16:13:55 INFO handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4207852d{/static/sql,null,AVAILABLE,@Spark}

17/11/16 16:13:56 INFO hive.HiveUtils: Initializing HiveMetastoreConnection
version 1.2.1 using Spark classes.

17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.

17/11/16 16:13:57 INFO session.SessionState: Created local directory:
/tmp/58660d8b-48ac-4cf0-bd06-6b96018a5482_resources

17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482

17/11/16 16:13:57 INFO session.SessionState: Created local directory:
/tmp/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482

17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482/_tmp_space.db

17/11/16 16:13:57 INFO client.HiveClientImpl: Warehouse location for Hive
client (version 1.2.1) is /apps/hive/warehouse

17/11/16 16:13:57 INFO sqlstd.SQLStdHiveAccessController: Created
SQLStdHiveAccessController for session context : HiveAuthzSessionContext
[sessionString=58660d8b-48ac-4cf0-bd06-6b96018a5482, clientType=HIVECLI]

17/11/16 16:13:57 INFO hive.metastore: Mestastore configuration
hive.metastore.filter.hook changed from
org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook

17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.

17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration
hive.metastore.filter.hook changed from
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to
org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl

17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

17/11/16 16:13:58 INFO session.SessionState: Created local directory:
/tmp/bd69eb21-01c1-4dd3-b31c-16e065ab4101_resources

17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101

17/11/16 16:13:58 INFO session.SessionState: Created local directory:
/tmp/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101

17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory:
/tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101/_tmp_space.db

17/11/16 16:13:58 INFO client.HiveClientImpl: Warehouse location for Hive
client (version 1.2.1) is /apps/hive/warehouse

17/11/16 16:13:58 INFO sqlstd.SQLStdHiveAccessController: Created
SQLStdHiveAccessController for session context : HiveAuthzSessionContext
[sessionString=bd69eb21-01c1-4dd3-b31c-16e065ab4101, clientType=HIVECLI]

17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration
hive.metastore.filter.hook changed from
org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to
org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook

17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

17/11/16 16:13:58 INFO state.StateStoreCoordinatorRef: Registered
StateStoreCoordinator endpoint

17/11/16 16:13:58 INFO execution.SparkSqlParser: Parsing command:
default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb

17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with
URI thrift://master01.trinitymobility.local:9083

17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: timestamp

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: boolean

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0 stored as
values in memory (estimated size 373.5 KB, free 365.9 MB)

17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0_piece0 stored
as bytes in memory (estimated size 35.8 KB, free 365.9 MB)

17/11/16 16:14:00 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in
memory on 192.168.1.135:33164 (size: 35.8 KB, free: 366.3 MB)

17/11/16 16:14:00 INFO spark.SparkContext: Created broadcast 0 from 

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952-a7a5-c0119e3da826.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-6e7511b0b57a.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-
a2ff093aaf56.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641
496.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4e
ba5a.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-11
1fe71bd39d.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d
.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92
ad38ef16d0.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484
c-b5ca-4f1c2fb4dec0.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dic
t

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208
.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131
dbe2d.dict

17/11/16 16:14:01 INFO common.CubeStatsReader: Estimating size for layer 0,
all cuboids are 536870911, total size is 0.010198831558227539

17/11/16 16:14:01 INFO spark.SparkCubingByLayer: Partition for spark cubing:
1
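
(For orientation: 536870911 is 2^29 - 1, the base-cuboid ID with every
dimension bit set, so this model appears to have 29 row-key columns, and
layer 0 holds only that base cuboid. The estimated size of about 0.01,
presumably in MB, falls well below the per-partition cut size (apparently
kylin.engine.spark.rdd-partition-cut-mb, default 10 MB), which is why the
layer collapses to the single Spark partition reported next.)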

17/11/16 16:14:01 INFO output.FileOutputCommitter: File Output Committer
Algorithm version is 1

17/11/16 16:14:01 INFO spark.SparkContext: Starting job: runJob at
SparkHadoopMapReduceWriter.scala:88

17/11/16 16:14:01 INFO mapred.FileInputFormat: Total input paths to process
: 1

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Registering RDD 6 (mapToPair
at SparkCubingByLayer.java:170)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Got job 0 (runJob at
SparkHadoopMapReduceWriter.scala:88) with 1 output partitions

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 1
(runJob at SparkHadoopMapReduceWriter.scala:88)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Parents of final stage:
List(ShuffleMapStage 0)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Missing parents:
List(ShuffleMapStage 0)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0
(MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170), which has
no missing parents

17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1 stored as
values in memory (estimated size 25.8 KB, free 365.9 MB)

17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1_piece0 stored
as bytes in memory (estimated size 10.7 KB, free 365.9 MB)

17/11/16 16:14:01 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in
memory on 192.168.1.135:33164 (size: 10.7 KB, free: 366.3 MB)

17/11/16 16:14:01 INFO spark.SparkContext: Created broadcast 1 from
broadcast at DAGScheduler.scala:1006

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
from ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at
SparkCubingByLayer.java:170) (first 15 tasks are for partitions Vector(0))

17/11/16 16:14:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with
1 tasks

17/11/16 16:14:02 INFO scheduler.TaskSetManager: Starting task 0.0 in stage
0.0 (TID 0, localhost, executor driver, partition 0, ANY, 4978 bytes)

17/11/16 16:14:02 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID
0)

17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp
1510829032977

17/11/16 16:14:02 INFO client.TransportClientFactory: Successfully created
connection to /192.168.1.135:42799 after 64 ms (0 ms spent in bootstraps)

17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp5518529699147501519.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/metrics-core-2.2.0.jar to class loader

17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp
1510829032977

17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/guava-12.0.1.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp2368368452706093062.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/guava-12.0.1.jar to class loader

17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with
timestamp 1510829032976

17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp4539374910339958167.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/htrace-core-3.1.0-incubating.jar to class loader

17/11/16 16:14:02 INFO executor.Executor: Fetching
spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp
1510829032978

17/11/16 16:14:02 INFO util.Utils: Fetching
spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar to
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1
-bfd5-e3459b0dc20e/fetchFileTemp9086394010889635270.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding
file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561
-48d1-bfd5-e3459b0dc20e/kylin-job-2.2.0.jar to class loader

17/11/16 16:14:02 INFO rdd.HadoopRDD: Input split:
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-8142
06fb79e3/kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc
00dbb/000000_0:0+19534

17/11/16 16:14:02 INFO zlib.ZlibFactory: Successfully loaded & initialized
native-zlib library

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor
[.deflate]

17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 251.01178 ms

17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 55.530064 ms

17/11/16 16:14:03 INFO common.AbstractHadoopJob: Ready to load KylinConfig
from uri:
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO cube.CubeManager: Initializing CubeManager with
config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO persistence.ResourceStore: Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

17/11/16 16:14:03 INFO hdfs.HDFSResourceStore: hdfs meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb

17/11/16 16:14:03 INFO cube.CubeManager: Loading Cube from folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube

17/11/16 16:14:03 INFO cube.CubeDescManager: Initializing CubeDescManager
with config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO cube.CubeDescManager: Reloading Cube Metadata from
folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_desc

17/11/16 16:14:03 INFO project.ProjectManager: Initializing ProjectManager
with metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 WARN cachesync.Broadcaster: More than one singleton exist

17/11/16 16:14:03 WARN project.ProjectManager: More than one singleton exist

17/11/16 16:14:03 INFO metadata.MetadataManager: Reloading data model at
/model_desc/test_sample_model.json

17/11/16 16:14:03 WARN metadata.MetadataManager: More than one singleton
exist, current keys: 1464031233,1545268424

17/11/16 16:14:03 INFO cube.CubeDescManager: Loaded 1 Cube(s)

17/11/16 16:14:03 WARN cube.CubeDescManager: More than one singleton exist

17/11/16 16:14:03 INFO cube.CubeManager: Reloaded cube test_sample_cube
being CUBE[name=test_sample_cube] having 1 segments

17/11/16 16:14:03 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes

17/11/16 16:14:03 WARN cube.CubeManager: More than one singleton exist

17/11/16 16:14:03 WARN cube.CubeManager: type: class
org.apache.kylin.common.KylinConfig reference: 1464031233

17/11/16 16:14:03 WARN cube.CubeManager: type: class
org.apache.kylin.common.KylinConfig reference: 1545268424

17/11/16 16:14:03 WARN dict.DictionaryManager: More than one singleton exist

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-
47afa69bea83.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-962
2-aeaf4fe878b6.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-
afd7-163137f3988e.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6
f-05977b6e3260.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8
a7a-0e1e4ef7dcd3.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-
bcf7-64d1b7b10fe0.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81d
c-fdac31787942.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a
222-954e1c13b537.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e
82-8852-ab9bf2a6a114.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db
12084dea3.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7b
c1693c3f09.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-
bb82-fd0cf01bede5.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9
b0b-64a7b03f8f84.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5
a-ab01-6e5ac7152672.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223
ee929068d.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952
-a7a5-c0119e3da826.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-
93bc6786a0e8.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-
6e7511b0b57a.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-
a2ff093aaf56.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641
496.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4e
ba5a.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-11
1fe71bd39d.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d
.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92
ad38ef16d0.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484
c-b5ca-4f1c2fb4dec0.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dic
t

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208
.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557)
loading DictionaryInfo(loadDictObj:true) at
/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131
dbe2d.dict

17/11/16 16:14:04 INFO executor.Executor: Finished task 0.0 in stage 0.0
(TID 0). 1347 bytes result sent to driver

17/11/16 16:14:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage
0.0 (TID 0) in 2084 ms on localhost (executor driver) (1/1)

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0,
whose tasks have all completed, from pool 

17/11/16 16:14:04 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPair
at SparkCubingByLayer.java:170) finished in 2.118 s

17/11/16 16:14:04 INFO scheduler.DAGScheduler: looking for newly runnable
stages

17/11/16 16:14:04 INFO scheduler.DAGScheduler: running: Set()

17/11/16 16:14:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)

17/11/16 16:14:04 INFO scheduler.DAGScheduler: failed: Set()

17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting ResultStage 1
(MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238), which has
no missing parents

17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2 stored as
values in memory (estimated size 82.3 KB, free 365.8 MB)

17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2_piece0 stored
as bytes in memory (estimated size 31.6 KB, free 365.8 MB)

17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in
memory on 192.168.1.135:33164 (size: 31.6 KB, free: 366.2 MB)

17/11/16 16:14:04 INFO spark.SparkContext: Created broadcast 2 from
broadcast at DAGScheduler.scala:1006

17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks
from ResultStage 1 (MapPartitionsRDD[8] at mapToPair at
SparkCubingByLayer.java:238) (first 15 tasks are for partitions Vector(0))

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with
1 tasks

17/11/16 16:14:04 INFO scheduler.TaskSetManager: Starting task 0.0 in stage
1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 bytes)

17/11/16 16:14:04 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID
1)

17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1
non-empty blocks out of 1 blocks

17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote
fetches in 6 ms

17/11/16 16:14:04 INFO memory.MemoryStore: Block rdd_7_0 stored as bytes in
memory (estimated size 49.2 KB, free 365.7 MB)

17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory on
192.168.1.135:33164 (size: 49.2 KB, free: 366.2 MB)

17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer
Algorithm version is 1

17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer
Algorithm version is 1

17/11/16 16:14:04 INFO common.AbstractHadoopJob: Ready to load KylinConfig
from uri:
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 INFO cube.CubeDescManager: Initializing CubeDescManager
with config
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 INFO persistence.ResourceStore: Using metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

17/11/16 16:14:04 INFO hdfs.HDFSResourceStore: hdfs meta path :
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb

17/11/16 16:14:04 INFO cube.CubeDescManager: Reloading Cube Metadata from
folder
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-f
c5f1cc00dbb/cube_desc

17/11/16 16:14:04 INFO project.ProjectManager: Initializing ProjectManager
with metadata url
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 WARN cachesync.Broadcaster: More than one singleton exist

17/11/16 16:14:04 WARN project.ProjectManager: More than one singleton exist

17/11/16 16:14:04 INFO metadata.MetadataManager: Reloading data model at
/model_desc/test_sample_model.json

17/11/16 16:14:04 WARN metadata.MetadataManager: More than one singleton
exist, current keys: 1464031233,1545268424,1474775600

17/11/16 16:14:04 INFO cube.CubeDescManager: Loaded 1 Cube(s)

17/11/16 16:14:04 WARN cube.CubeDescManager: More than one singleton exist

17/11/16 16:14:04 INFO output.FileOutputCommitter: Saved output of task
'attempt_20171116161401_0001_r_000000_0' to
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-8142
06fb79e3/test_sample_cube/cuboid/level_base_cuboid/_temporary/0/task_2017111
6161401_0001_r_000000

17/11/16 16:14:04 INFO mapred.SparkHadoopMapRedUtil:
attempt_20171116161401_0001_r_000000_0: Committed

17/11/16 16:14:04 ERROR executor.Executor: Exception in task 0.0 in stage
1.0 (TID 1)

java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)
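
The root cause is visible right here: the cuboid rows themselves serialize
fine, but after FileOutputCommitter commits the task, the executor must
Kryo-serialize the TaskCommitMessage it returns to the driver, and that
internal Spark class is not on the registration list while
spark.kryo.registrationRequired is in force (Kryo only throws "Class is not
registered" in that mode). The "kryo.register(...)" hint in the message
amounts to a registrator along the following lines. This is a minimal
sketch only: the class name TaskCommitMessageRegistrator and wiring it in
via spark.kryo.registrator are illustrative assumptions, not code that
ships with Kylin (2.2.0 appears to use its own
org.apache.kylin.engine.spark.KylinKryoRegistrator).

    import com.esotericsoftware.kryo.Kryo;
    import org.apache.spark.serializer.KryoRegistrator;

    // Hypothetical registrator: adds the one class the executor failed on.
    // Class.forName is used because TaskCommitMessage is nested inside a
    // Scala companion object, so its binary name contains a '$'.
    public class TaskCommitMessageRegistrator implements KryoRegistrator {
        @Override
        public void registerClasses(Kryo kryo) {
            try {
                kryo.register(Class.forName(
                    "org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage"));
            } catch (ClassNotFoundException e) {
                throw new RuntimeException(e);
            }
        }
    }

Patching configuration is usually easier than patching code, though; see
the note after the full submitted command and its stack trace, below.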

17/11/16 16:14:04 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0
(TID 1, localhost, executor driver): java.lang.IllegalArgumentException:
Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)

 

17/11/16 16:14:04 ERROR scheduler.TaskSetManager: Task 0 in stage 1.0 failed
1 times; aborting job

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0,
whose tasks have all completed, from pool 

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1

17/11/16 16:14:04 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at
SparkHadoopMapReduceWriter.scala:88) failed in 0.393 s due to Job aborted
due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent
failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver):
java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)

 

Driver stacktrace:

17/11/16 16:14:04 INFO scheduler.DAGScheduler: Job 0 failed: runJob at
SparkHadoopMapReduceWriter.scala:88, took 3.135125 s

17/11/16 16:14:04 ERROR io.SparkHadoopMapReduceWriter: Aborting job
job_20171116161401_0008.

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in
stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0
(TID 1, localhost, executor driver): java.lang.IllegalArgumentException:
Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)

 

Driver stacktrace:

                at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGSchedu
ler$$failJobAndIndependentStages(DAGScheduler.scala:1499)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGSched
uler.scala:1487)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGSched
uler.scala:1486)

                at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:5
9)

                at
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)

                at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply
(DAGScheduler.scala:814)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply
(DAGScheduler.scala:814)

                at scala.Option.foreach(Option.scala:257)

                at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.sca
la:814)

                at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGSched
uler.scala:1714)

                at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGSchedul
er.scala:1669)

                at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGSchedul
er.scala:1658)

                at
org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

                at
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)

                at
org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)

                at
org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)

                at
org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)

                at
org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMa
pReduceWriter.scala:88)

                at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.a
pply$mcV$sp(PairRDDFunctions.scala:1085)

                at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.a
pply(PairRDDFunctions.scala:1085)

                at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.a
pply(PairRDDFunctions.scala:1085)

                at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:15
1)

                at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:11
2)

                at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

                at
org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunct
ions.scala:1084)

                at
org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.
scala:831)

                at
org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(SparkCubingByLay
er.java:238)

                at
org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.
java:192)

                at
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication
.java:37)

                at
org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)

                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
Method)

                at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62
)

                at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
.java:43)

                at java.lang.reflect.Method.invoke(Method.java:498)

                at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$ru
nMain(SparkSubmit.scala:755)

                at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)

                at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)

                at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)

                at
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)

Exception in thread "main" java.lang.RuntimeException: error execute
org.apache.kylin.engine.spark.SparkCubingByLayer

                at
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication
.java:42)

                at
org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)

                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
Method)

                at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62
)

                at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl
.java:43)

                at java.lang.reflect.Method.invoke(Method.java:498)

                at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$ru
nMain(SparkSubmit.scala:755)

                at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)

                at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)

                at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)

                at
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: org.apache.spark.SparkException: Job aborted.

                at
org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMa
pReduceWriter.scala:107)

                at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.a
pply$mcV$sp(PairRDDFunctions.scala:1085)

                at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.a
pply(PairRDDFunctions.scala:1085)

                at
org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.a
pply(PairRDDFunctions.scala:1085)

                at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:15
1)

                at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:11
2)

                at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

                at
org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunct
ions.scala:1084)

                at
org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.
scala:831)

                at
org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(SparkCubingByLay
er.java:238)

                at
org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.
java:192)

                at
org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication
.java:37)

                ... 10 more

Caused by: org.apache.spark.SparkException: Job aborted due to stage
failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task
0.0 in stage 1.0 (TID 1, localhost, executor driver):
java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)

 

Driver stacktrace:

                at
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGSchedu
ler$$failJobAndIndependentStages(DAGScheduler.scala:1499)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGSched
uler.scala:1487)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGSched
uler.scala:1486)

                at
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:5
9)

                at
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)

                at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply
(DAGScheduler.scala:814)

                at
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply
(DAGScheduler.scala:814)

                at scala.Option.foreach(Option.scala:257)

                at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.sca
la:814)

                at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGSched
uler.scala:1714)

                at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGSchedul
er.scala:1669)

                at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGSchedul
er.scala:1658)

                at
org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

                at
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)

                at
org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)

                at
org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)

                at
org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)

                at
org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMa
pReduceWriter.scala:88)

                ... 21 more

Caused by: java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use:
kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMess
age.class);

                at
com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

                at
com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassR
esolver.java:97)

                at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

                at
com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

                at
org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.
scala:315)

                at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)

17/11/16 16:14:04 INFO spark.SparkContext: Invoking stop() from shutdown
hook

17/11/16 16:14:04 INFO server.AbstractConnector: Stopped
Spark@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}

17/11/16 16:14:04 INFO ui.SparkUI: Stopped Spark web UI at
http://192.168.1.135:4040

17/11/16 16:14:04 INFO spark.MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!

17/11/16 16:14:04 INFO memory.MemoryStore: MemoryStore cleared

17/11/16 16:14:04 INFO storage.BlockManager: BlockManager stopped

17/11/16 16:14:04 INFO storage.BlockManagerMaster: BlockManagerMaster
stopped

17/11/16 16:14:04 INFO
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!

17/11/16 16:14:04 INFO spark.SparkContext: Successfully stopped SparkContext

17/11/16 16:14:04 INFO util.ShutdownHookManager: Shutdown hook called

17/11/16 16:14:04 INFO util.ShutdownHookManager: Deleting directory
/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6

The command is: 

export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf &&
/usr/local/kylin/spark/bin/spark-submit --class
org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1
--conf spark.yarn.archive=hdfs://trinitybdhdfs/kylin/spark/spark-libs.jar
--conf spark.yarn.queue=default  --conf
spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf
spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf
spark.master=local[*]  --conf
spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf
spark.hadoop.yarn.timeline-service.enabled=false  --conf
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf
spark.executor.cores=2 --jars
/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4
.3.0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0-227/hbase/lib/gua
va-12.0.1.jar, /usr/local/kylin/lib/kylin-job-2.2.0.jar -className
org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable
default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc0
0dbb -output
hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-8142
06fb79e3/test_sample_cube/cuboid/ -segmentId
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl
kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/
d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube

                at
org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.j
ava:92)

                at
org.apache.kylin.engine.spark.SparkExecutable.doWork(SparkExecutable.java:15
2)

                at
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable
.java:125)

                at
org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChaine
dExecutable.java:64)

                at
org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable
.java:125)

                at
org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultS
cheduler.java:144)

                at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:11
42)

                at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:6
17)

                at java.lang.Thread.run(Thread.java:745)
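
Every --conf in the command above is generated from kylin.properties
entries carrying the kylin.engine.spark-conf. prefix, so the serializer
behaviour can be changed without editing any script. A hedged sketch of
the two workarounds commonly suggested for this error, assuming 2.2.0
semantics and that SparkCubingByLayer does not pin these settings
programmatically (pick one):

    # kylin.properties, option 1: sidestep Kryo for the Spark cubing job
    kylin.engine.spark-conf.spark.serializer=org.apache.spark.serializer.JavaSerializer

    # option 2: keep Kryo, but register the missing internal class explicitly
    kylin.engine.spark-conf.spark.kryo.classesToRegister=org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

After editing kylin.properties, restart Kylin and resume the failed job
from the Monitor page; the "Build Cube with Spark" step is then
re-submitted with the new --conf values. The cleaner long-term route,
reportedly, is a Kylin release whose built-in Kryo registrator already
includes this class.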

2017-11-16 16:14:05,169 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 :
job id:26342fa2-68ac-48e4-9eea-814206fb79e3-06 from RUNNING to ERROR

2017-11-16 16:14:05,217 INFO  [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 :
job id:26342fa2-68ac-48e4-9eea-814206fb79e3 from RUNNING to ERROR

2017-11-16 16:14:05,217 DEBUG [Scheduler 1211098754 Job
26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:259 :
no need to send email, user list is empty

2017-11-16 16:14:05,226 INFO  [pool-8-thread-1]
threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 0 actual
running, 0 stopped, 0 ready, 3 already succeed, 1 error, 1 discarded, 0
others

2017-11-16 16:14:16,344 INFO  [pool-8-thread-1]
threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 0 actual
running, 0 stopped, 0 ready, 3 already succeed, 1 error, 1 discarded, 0
others

 

 

