From: ShaoFeng Shi
Date: Thu, 16 Nov 2017 21:26:00 +0800
Subject: Re: Kylin 2.2.0 is failed at 7th step.
To: user@kylin.apache.org

Are you running Spark 2.2? Kylin 2.2 supports Spark 2.1.1; please use the embedded Spark by setting SPARK_HOME to KYLIN_HOME/spark.

2017-11-16 19:33 GMT+08:00 Prasanna:

> Hi all,
>
> I installed Kylin 2.2.0 by following
> http://kylin.apache.org/docs21/tutorial/cube_spark.html, and the Kylin
> service started successfully. I then tried to build a Kylin cube with the
> Spark engine, but it fails at the 7th step, "Build Cube with Spark".
> Please suggest how to solve this problem; it is high priority for me.
> These are my logs:
> 2017-11-16 16:13:46,345 INFO [pool-8-thread-1] threadpool.DefaultScheduler:113 : CubingJob{id=26342fa2-68ac-48e4-9eea-814206fb79e3, name=BUILD CUBE - test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27, state=READY} prepare to schedule
> 2017-11-16 16:13:46,346 INFO [pool-8-thread-1] threadpool.DefaultScheduler:116 : CubingJob{id=26342fa2-68ac-48e4-9eea-814206fb79e3, name=BUILD CUBE - test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27, state=READY} scheduled
> 2017-11-16 16:13:46,346 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:111 : Executing AbstractExecutable (BUILD CUBE - test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27)
> 2017-11-16 16:13:46,349 INFO [pool-8-thread-1] threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 1 actual running, 0 stopped, 1 ready, 3 already succeed, 0 error, 1 discarded, 0 others
> 2017-11-16 16:13:46,360 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 : job id:26342fa2-68ac-48e4-9eea-814206fb79e3 from READY to RUNNING
> 2017-11-16 16:13:46,373 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:111 : Executing AbstractExecutable (Build Cube with Spark)
> 2017-11-16 16:13:46,385 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 : job id:26342fa2-68ac-48e4-9eea-814206fb79e3-06 from READY to RUNNING
> 2017-11-16 16:13:46,399 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:76 : SPARK_HOME was set to /usr/local/kylin/spark
> 2017-11-16 16:13:46,399 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:120 : Using /usr/local/kylin/hadoop-conf as HADOOP_CONF_DIR
> 2017-11-16 16:13:46,900 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:162 : Kylin Config was updated with kylin.metadata.url : /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta
> 2017-11-16 16:13:46,901 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79 : Using metadata url /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta for resource store
> 2017-11-16 16:13:47,038 WARN [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] util.HeapMemorySizeUtil:55 : hbase.regionserver.global.memstore.upperLimit is deprecated by hbase.regionserver.global.memstore.size
> 2017-11-16 16:13:47,103 DEBUG [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.JobRelatedMetaUtil:70 : Dump resources to /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta took 203 ms
> 2017-11-16 16:13:47,105 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:162 : Kylin Config was updated with kylin.metadata.url : /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta
> 2017-11-16 16:13:47,105 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79 : Using metadata url /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080231586/meta for resource store
> 2017-11-16
16:13:47,105 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79 : > Using metadata url kylin_metadata@hdfs,path=3Dhdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5= f1cc00dbb > for resource store > > 2017-11-16 16:13:47,155 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:76 : > hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:13:47,157 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceTool:167 : > Copy from /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809080= 231586/meta > to org.apache.kylin.storage.hdfs.HDFSResourceStore@2f8908ea > > 2017-11-16 16:13:47,157 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /cube/test_sample_cube.json > > 2017-11-16 16:13:47,157 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube/test_sample_cube.json > > 2017-11-16 16:13:47,197 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /cube_desc/test_sample_cube.json > > 2017-11-16 16:13:47,197 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc/test_sample_cube.json > > 2017-11-16 16:13:47,320 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /cube_statistics/test_sample_cube/d4ccd867-e0ae-4ec2-b2ff- > fc5f1cc00dbb.seq > > 2017-11-16 16:13:47,320 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_statistics/ > test_sample_cube/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb.seq > > 2017-11-16 16:13:48,998 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a- > 43cd-a8ed-e18a5a4eba5a.dict > > 2017-11-16 16:13:48,998 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4eba5a.dict > > 2017-11-16 16:13:49,031 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885- > 3299-4912-8570-111fe71bd39d.dict > > 2017-11-16 16:13:49,031 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. 
> INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict > > 2017-11-16 16:13:49,064 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092- > 4de55ee3ecbf.dict > > 2017-11-16 16:13:49,064 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dict > > 2017-11-16 16:13:49,097 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/ > c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict > > 2017-11-16 16:13:49,098 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb- > 484c-b5ca-4f1c2fb4dec0.dict > > 2017-11-16 16:13:49,131 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d- > 42d0-b72e-696c131dbe2d.dict > > 2017-11-16 16:13:49,131 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131dbe2d.dict > > 2017-11-16 16:13:49,164 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c- > 8f05-9a43099c1208.dict > > 2017-11-16 16:13:49,164 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208.dict > > 2017-11-16 16:13:49,197 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661- > 9c34-f6dc6e6db62d.dict > > 2017-11-16 16:13:49,198 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict > > 2017-11-16 16:13:49,231 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393- > 9f8c-49c6-b640-92ad38ef16d0.dict > > 2017-11-16 16:13:49,231 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. 
> INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92ad38ef16d0.dict > > 2017-11-16 16:13:49,306 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb- > 4b2d-8745-8af522641496.dict > > 2017-11-16 16:13:49,306 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict > > 2017-11-16 16:13:49,339 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/ > 2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict > > 2017-11-16 16:13:49,339 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6- > a2ff093aaf56.dict > > 2017-11-16 16:13:49,372 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/ > 44c5ee32-f62c-4d20-a222-954e1c13b537.dict > > 2017-11-16 16:13:49,373 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20- > a222-954e1c13b537.dict > > 2017-11-16 16:13:49,406 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/ > 19c299e7-6190-4951-afd7-163137f3988e.dict > > 2017-11-16 16:13:49,406 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951- > afd7-163137f3988e.dict > > 2017-11-16 16:13:49,439 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ > ab4a3960-ade3-4537-8198-93bc6786a0e8.dict > > 2017-11-16 16:13:49,439 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3- > 4537-8198-93bc6786a0e8.dict > > 2017-11-16 16:13:49,472 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/ > 328f7642-2092-4d4c-83df-6e7511b0b57a.dict > > 2017-11-16 16:13:49,473 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. 
> V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df- > 6e7511b0b57a.dict > > 2017-11-16 16:13:49,506 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/ > f8564625-7ec0-4074-9a6f-05977b6e3260.dict > > 2017-11-16 16:13:49,506 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0- > 4074-9a6f-05977b6e3260.dict > > 2017-11-16 16:13:49,539 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/ > 739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict > > 2017-11-16 16:13:49,539 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4- > 4685-8a7a-0e1e4ef7dcd3.dict > > 2017-11-16 16:13:49,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/ > 8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict > > 2017-11-16 16:13:49,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58- > 19b3-46cb-bcf7-64d1b7b10fe0.dict > > 2017-11-16 16:13:49,606 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/ > cab12a4e-2ec9-4d28-81dc-fdac31787942.dict > > 2017-11-16 16:13:49,606 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9- > 4d28-81dc-fdac31787942.dict > > 2017-11-16 16:13:49,639 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/ > d5316933-8229-46c0-bb82-fd0cf01bede5.dict > > 2017-11-16 16:13:49,639 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933- > 8229-46c0-bb82-fd0cf01bede5.dict > > 2017-11-16 16:13:49,672 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/ > 391cbd21-cc46-48f6-8531-47afa69bea83.dict > > 2017-11-16 16:13:49,673 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. 
> V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46- > 48f6-8531-47afa69bea83.dict > > 2017-11-16 16:13:49,706 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_ > DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict > > 2017-11-16 16:13:49,706 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386- > 68e1-4e82-8852-ab9bf2a6a114.dict > > 2017-11-16 16:13:49,739 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/ > 4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict > > 2017-11-16 16:13:49,739 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9- > 403c-9b0b-64a7b03f8f84.dict > > 2017-11-16 16:13:49,772 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/ > 0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict > > 2017-11-16 16:13:49,773 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f- > 4553-9622-aeaf4fe878b6.dict > > 2017-11-16 16:13:49,806 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe- > 8b11-4fc1-9943-9db12084dea3.dict > > 2017-11-16 16:13:49,806 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict > > 2017-11-16 16:13:49,839 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/ > eb13b237-61a5-4421-b390-7bc1693c3f09.dict > > 2017-11-16 16:13:49,839 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict > > 2017-11-16 16:13:49,872 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ > ae15677f-29b9-4952-a7a5-c0119e3da826.dict > > 2017-11-16 16:13:49,873 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. 
> V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952- > a7a5-c0119e3da826.dict > > 2017-11-16 16:13:49,906 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_ > DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict > > 2017-11-16 16:13:49,906 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260- > 4f5a-ab01-6e5ac7152672.dict > > 2017-11-16 16:13:49,939 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be- > 3710-44d4-a85e-223ee929068d.dict > > 2017-11-16 16:13:49,939 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC. > V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict > > 2017-11-16 16:13:49,972 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /kylin.properties > > 2017-11-16 16:13:49,973 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/kylin.properties > > 2017-11-16 16:13:50,006 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /model_desc/test_sample_model.json > > 2017-11-16 16:13:50,006 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/model_desc/test_sample_model.json > > 2017-11-16 16:13:50,039 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /project/test_sample.json > > 2017-11-16 16:13:50,039 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/project/test_sample.json > > 2017-11-16 16:13:50,072 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /table/TRINITYICCC.INDEX_EVENT.json > > 2017-11-16 16:13:50,073 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/table/TRINITYICCC.INDEX_EVENT.json > > 2017-11-16 16:13:50,139 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : > res path : /table/TRINITYICCC.PINMAPPING_FACT.json > > 2017-11-16 16:13:50,139 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : > put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/table/TRINITYICCC.PINMAPPING_FACT. 
> json
> 2017-11-16 16:13:50,181 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /table/TRINITYICCC.V_ANALYST_INCIDENTS.json
> 2017-11-16 16:13:50,181 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/table/TRINITYICCC.V_ANALYST_INCIDENTS.json
> 2017-11-16 16:13:50,214 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:76 : SPARK_HOME was set to /usr/local/kylin/spark
> 2017-11-16 16:13:50,215 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:149 : cmd: export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && /usr/local/kylin/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry --conf spark.executor.instances=1 --conf spark.yarn.archive=hdfs://trinitybdhdfs/kylin/spark/spark-libs.jar --conf spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227 --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history --conf spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227 --conf spark.master=local[*] --conf spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227 --conf spark.hadoop.yarn.timeline-service.enabled=false --conf spark.executor.memory=1G --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs:///kylin/spark-history --conf spark.executor.cores=2 --jars /usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar,/usr/local/kylin/lib/kylin-job-2.2.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube
> 2017-11-16 16:13:51,639 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube
> 2017-11-16 16:13:51,649 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Abstract Application args:-hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube
> 2017-11-16 16:13:51,725 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:51 INFO spark.SparkContext: Running Spark version 2.2.0
> 2017-11-16 16:13:52,221 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkContext: Submitted application: Cubing for:test_sample_cube segment d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
> 2017-11-16 16:13:52,247 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls to: hdfs
> 2017-11-16 16:13:52,248 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls to: hdfs
> 2017-11-16 16:13:52,248 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls groups to:
> 2017-11-16 16:13:52,249 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls groups to:
> 2017-11-16 16:13:52,249 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hdfs); groups with view permissions: Set(); users with modify permissions: Set(hdfs); groups with modify permissions: Set()
> 2017-11-16 16:13:52,572 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO util.Utils: Successfully started service 'sparkDriver' on port 42799.
> > 2017-11-16 16:13:52,593 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkEnv: Registering MapOutputTracker > > 2017-11-16 16:13:52,613 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkEnv: Registering BlockManagerMaster > > 2017-11-16 16:13:52,616 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: Using > org.apache.spark.storage.DefaultTopologyMapper for getting topology > information > > 2017-11-16 16:13:52,617 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: > BlockManagerMasterEndpoint up > > 2017-11-16 16:13:52,634 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO storage.DiskBlockManager: Created local directory = at > /tmp/blockmgr-b8d6ec0d-8a73-4ce6-9dbf-64002d5e2a62 > > 2017-11-16 16:13:52,655 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO memory.MemoryStore: MemoryStore started with > capacity 366.3 MB > > 2017-11-16 16:13:52,712 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkEnv: Registering OutputCommitCoordinato= r > > 2017-11-16 16:13:52,790 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO util.log: Logging initialized @2149ms > > 2017-11-16 16:13:52,859 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO server.Server: jetty-9.3.z-SNAPSHOT > > 2017-11-16 16:13:52,875 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO server.Server: Started @2235ms > > 2017-11-16 16:13:52,896 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO server.AbstractConnector: Started > ServerConnector@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040} > > 2017-11-16 16:13:52,897 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO util.Utils: Successfully started service 'SparkUI' > on port 4040. 
> > 2017-11-16 16:13:52,923 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@f5c79a6{/jobs,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,923 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1fc793c2{/jobs/json,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,924 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@329a1243{/jobs/job,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,925 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@27f9e982{/jobs/job/json,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:52,926 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@37d3d232{/stages,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,926 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@581d969c{/stages/json,null,AVAILABLE,@Spark= } > > 2017-11-16 16:13:52,927 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2b46a8c1{/stages/stage,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:52,928 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@5851bd4f{/stages/stage/json, > null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,929 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2f40a43{/stages/pool,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,930 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@69c43e48{/stages/pool/json, > null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,930 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@3a80515c{/storage,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,931 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1c807b1d{/storage/json,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:52,932 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1b39fd82{/storage/rdd,null,AVAILABLE,@Spark= } > > 2017-11-16 16:13:52,933 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] 
spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@21680803{/storage/rdd/json, > null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,933 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@c8b96ec{/environment,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,934 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2d8f2f3a{/environment/json, > null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,935 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@7048f722{/executors,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,936 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@58a55449{/executors/json,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:52,936 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@6e0ff644{/executors/ > threadDump,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,937 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2a2bb0eb{/executors/threadDump/json,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:52,945 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2d0566ba{/static,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,946 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@29d2d081{/,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,948 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@58783f6c{/api,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,948 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@88d6f9b{/jobs/job/kill,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:52,949 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@475b7792{/stages/stage/kill, > null,AVAILABLE,@Spark} > > 2017-11-16 16:13:52,952 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started = at > http://192.168.1.135:4040 > > 2017-11-16 16:13:52,977 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > 
file:/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar at > spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with > timestamp 1510829032976 > > 2017-11-16 16:13:52,977 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar at spark:// > 192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp > 1510829032977 > > 2017-11-16 16:13:52,977 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar at spark:// > 192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977 > > 2017-11-16 16:13:52,978 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/local/kylin/lib/kylin-job-2.2.0.jar at spark:// > 192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978 > > 2017-11-16 16:13:53,042 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO executor.Executor: Starting executor ID driver on > host localhost > > 2017-11-16 16:13:53,061 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO util.Utils: Successfully started service > 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33164. > > 2017-11-16 16:13:53,066 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO netty.NettyBlockTransferService: Server created on > 192.168.1.135:33164 > > 2017-11-16 16:13:53,068 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO storage.BlockManager: Using org.apache.spark.stora= ge.RandomBlockReplicationPolicy > for block replication policy > > 2017-11-16 16:13:53,070 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registering BlockManag= er > BlockManagerId(driver, 192.168.1.135, 33164, None) > > 2017-11-16 16:13:53,073 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO storage.BlockManagerMasterEndpoint: Registering > block manager 192.168.1.135:33164 with 366.3 MB RAM, > BlockManagerId(driver, 192.168.1.135, 33164, None) > > 2017-11-16 16:13:53,076 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registered BlockManage= r > BlockManagerId(driver, 192.168.1.135, 33164, None) > > 2017-11-16 16:13:53,076 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO storage.BlockManager: Initialized BlockManager: > BlockManagerId(driver, 192.168.1.135, 33164, None) > > 2017-11-16 16:13:53,225 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:53 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1b9c1b51{/metrics/json,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:54,057 INFO [Scheduler 
1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO scheduler.EventLoggingListener: Logging events to > hdfs:///kylin/spark-history/local-1510829033012 > > 2017-11-16 16:13:54,093 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO common.AbstractHadoopJob: Ready to load KylinConfi= g > from uri: kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:13:54,250 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeManager: Initializing CubeManager with > config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:13:54,254 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO persistence.ResourceStore: Using metadata url > kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store > > 2017-11-16 16:13:54,282 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO hdfs.HDFSResourceStore: hdfs meta path : > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:13:54,295 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeManager: Loading Cube from folder > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube > > 2017-11-16 16:13:54,640 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeDescManager: Initializing CubeDescManager > with config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:13:54,641 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeDescManager: Reloading Cube Metadata from > folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc > > 2017-11-16 16:13:54,705 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO project.ProjectManager: Initializing ProjectManage= r > with metadata url kylin_metadata@hdfs,path=3Dhdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867- > e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:13:54,761 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: Checking custom measur= e > types from kylin config > > 2017-11-16 16:13:54,762 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > COUNT_DISTINCT(hllc), class org.apache.kylin.measure.hllc. 
> HLLCMeasureType$Factory > > 2017-11-16 16:13:54,768 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > COUNT_DISTINCT(bitmap), class org.apache.kylin.measure. > bitmap.BitmapMeasureType$Factory > > 2017-11-16 16:13:54,774 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering TOP_N(topn= ), > class org.apache.kylin.measure.topn.TopNMeasureType$Factory > > 2017-11-16 16:13:54,776 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering RAW(raw), > class org.apache.kylin.measure.raw.RawMeasureType$Factory > > 2017-11-16 16:13:54,778 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > EXTENDED_COLUMN(extendedcolumn), class org.apache.kylin.measure. > extendedcolumn.ExtendedColumnMeasureType$Factory > > 2017-11-16 16:13:54,780 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > PERCENTILE(percentile), class org.apache.kylin.measure.percentile. > PercentileMeasureType$Factory > > 2017-11-16 16:13:54,800 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO metadata.MetadataManager: Reloading data model at > /model_desc/test_sample_model.json > > 2017-11-16 16:13:54,931 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeDescManager: Loaded 1 Cube(s) > > 2017-11-16 16:13:54,932 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeManager: Reloaded cube test_sample_cube > being CUBE[name=3Dtest_sample_cube] having 1 segments > > 2017-11-16 16:13:54,932 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes > > 2017-11-16 16:13:54,942 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:54 INFO spark.SparkCubingByLayer: RDD Output path: > hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ > > 2017-11-16 16:13:55,758 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO spark.SparkCubingByLayer: All measure are normal > (agg on all cuboids) ? : true > > 2017-11-16 16:13:55,868 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO internal.SharedState: loading hive config file: > file:/usr/local/spark/conf/hive-site.xml > > 2017-11-16 16:13:55,888 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO internal.SharedState: spark.sql.warehouse.dir is n= ot > set, but hive.metastore.warehouse.dir is set. Setting > spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir > ('/apps/hive/warehouse'). 
> > 2017-11-16 16:13:55,889 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO internal.SharedState: Warehouse path is > '/apps/hive/warehouse'. > > 2017-11-16 16:13:55,895 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@75cf0de5{/SQL,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:55,895 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@468173fa{/SQL/json,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:55,896 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@27e2287c{/SQL/execution,null, > AVAILABLE,@Spark} > > 2017-11-16 16:13:55,896 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2cd388f5{/SQL/execution/json, > null,AVAILABLE,@Spark} > > 2017-11-16 16:13:55,898 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@4207852d{/static/sql,null,AVAILABLE,@Spark} > > 2017-11-16 16:13:56,397 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:56 INFO hive.HiveUtils: Initializing HiveMetastoreConnecti= on > version 1.2.1 using Spark classes. > > 2017-11-16 16:13:57,548 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore wit= h > URI thrift://master01.trinitymobility.local:9083 > > 2017-11-16 16:13:57,583 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO hive.metastore: Connected to metastore. 
> > 2017-11-16 16:13:57,709 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO session.SessionState: Created local directory: > /tmp/58660d8b-48ac-4cf0-bd06-6b96018a5482_resources > > 2017-11-16 16:13:57,732 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482 > > 2017-11-16 16:13:57,734 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO session.SessionState: Created local directory: > /tmp/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482 > > 2017-11-16 16:13:57,738 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482/_tmp_space.db > > 2017-11-16 16:13:57,740 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO client.HiveClientImpl: Warehouse location for Hive > client (version 1.2.1) is /apps/hive/warehouse > > 2017-11-16 16:13:57,751 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO sqlstd.SQLStdHiveAccessController: Created > SQLStdHiveAccessController for session context : HiveAuthzSessionContext > [sessionString=3D58660d8b-48ac-4cf0-bd06-6b96018a5482, clientType=3DHIVEC= LI] > > 2017-11-16 16:13:57,752 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO hive.metastore: Mestastore configuration > hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.= DefaultMetaStoreFilterHookImpl > to org.apache.hadoop.hive.ql.security.authorization.plugin. > AuthorizationMetaStoreFilterHook > > 2017-11-16 16:13:57,756 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore wit= h > URI thrift://master01.trinitymobility.local:9083 > > 2017-11-16 16:13:57,757 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:57 INFO hive.metastore: Connected to metastore. > > 2017-11-16 16:13:58,073 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration > hive.metastore.filter.hook changed from org.apache.hadoop.hive.ql. > security.authorization.plugin.AuthorizationMetaStoreFilterHook to > org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl > > 2017-11-16 16:13:58,073 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore wit= h > URI thrift://master01.trinitymobility.local:9083 > > 2017-11-16 16:13:58,075 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore. 
> > 2017-11-16 16:13:58,078 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO session.SessionState: Created local directory: > /tmp/bd69eb21-01c1-4dd3-b31c-16e065ab4101_resources > > 2017-11-16 16:13:58,088 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101 > > 2017-11-16 16:13:58,089 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO session.SessionState: Created local directory: > /tmp/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101 > > 2017-11-16 16:13:58,096 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101/_tmp_space.db > > 2017-11-16 16:13:58,097 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO client.HiveClientImpl: Warehouse location for Hive > client (version 1.2.1) is /apps/hive/warehouse > > 2017-11-16 16:13:58,098 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO sqlstd.SQLStdHiveAccessController: Created > SQLStdHiveAccessController for session context : HiveAuthzSessionContext > [sessionString=3Dbd69eb21-01c1-4dd3-b31c-16e065ab4101, clientType=3DHIVEC= LI] > > 2017-11-16 16:13:58,098 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration > hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.= DefaultMetaStoreFilterHookImpl > to org.apache.hadoop.hive.ql.security.authorization.plugin. > AuthorizationMetaStoreFilterHook > > 2017-11-16 16:13:58,098 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore wit= h > URI thrift://master01.trinitymobility.local:9083 > > 2017-11-16 16:13:58,100 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore. > > 2017-11-16 16:13:58,139 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO state.StateStoreCoordinatorRef: Registered > StateStoreCoordinator endpoint > > 2017-11-16 16:13:58,143 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO execution.SparkSqlParser: Parsing command: > default.kylin_intermediate_test_sample_cube_d4ccd867_ > e0ae_4ec2_b2ff_fc5f1cc00dbb > > 2017-11-16 16:13:58,292 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore wit= h > URI thrift://master01.trinitymobility.local:9083 > > 2017-11-16 16:13:58,294 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore. 
> > 2017-11-16 16:13:58,345 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,355 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,355 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,356 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,356 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,356 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,356 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: timesta= mp > > 2017-11-16 16:13:58,357 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,357 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,358 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,358 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double > > 2017-11-16 16:13:58,358 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double > > 2017-11-16 16:13:58,359 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,359 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,359 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,360 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,360 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,360 INFO [Scheduler 1211098754 Job > 
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,361 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,361 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,361 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,362 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,362 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string > > 2017-11-16 16:13:58,362 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: boolean > > 2017-11-16 16:13:58,363 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,363 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,363 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,363 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:13:58,364 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int > > 2017-11-16 16:14:00,368 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0 stored as > values in memory (estimated size 373.5 KB, free 365.9 MB) > > 2017-11-16 16:14:00,685 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0_piece0 store= d > as bytes in memory (estimated size 35.8 KB, free 365.9 MB) > > 2017-11-16 16:14:00,688 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:00 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 > in memory on 192.168.1.135:33164 (size: 35.8 KB, free: 366.3 MB) > > 2017-11-16 16:14:00,691 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:00 INFO spark.SparkContext: Created broadcast 0 from > > 2017-11-16 16:14:01,094 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO 
dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict > > 2017-11-16 16:14:01,106 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict > > 2017-11-16 16:14:01,111 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict > > 2017-11-16 16:14:01,115 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict > > 2017-11-16 16:14:01,119 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict > > 2017-11-16 16:14:01,122 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict > > 2017-11-16 16:14:01,127 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict > > 2017-11-16 16:14:01,131 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict > > 2017-11-16 16:14:01,135 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict > > 2017-11-16 16:14:01,139 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict > > 2017-11-16 16:14:01,142 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 
INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict > > 2017-11-16 16:14:01,146 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict > > 2017-11-16 16:14:01,149 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict > > 2017-11-16 16:14:01,152 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict > > 2017-11-16 16:14:01,156 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict > > 2017-11-16 16:14:01,159 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952-a7a5-c0119e3da826.dict > > 2017-11-16 16:14:01,164 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict > > 2017-11-16 16:14:01,167 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-6e7511b0b57a.dict > > 2017-11-16 16:14:01,171 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_ > INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict > > 2017-11-16 16:14:01,174 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict > > 2017-11-16 16:14:01,178 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO 
dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4eba5a.dict > > 2017-11-16 16:14:01,181 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict > > 2017-11-16 16:14:01,184 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict > > 2017-11-16 16:14:01,188 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92ad38ef16d0.dict > > 2017-11-16 16:14:01,191 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict > > 2017-11-16 16:14:01,195 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dict > > 2017-11-16 16:14:01,199 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208.dict > > 2017-11-16 16:14:01,202 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(29301960= 6) > loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT= / > IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131dbe2d.dict > > 2017-11-16 16:14:01,261 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO common.CubeStatsReader: Estimating size for layer = 0, > all cuboids are 536870911, total size is 0.010198831558227539 > > 2017-11-16 16:14:01,261 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO spark.SparkCubingByLayer: Partition for spark > cubing: 1 > > 2017-11-16 16:14:01,353 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO output.FileOutputCommitter: File Output Committer > Algorithm version is 1 > > 2017-11-16 16:14:01,422 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:01 INFO spark.SparkContext: Starting job: runJob at > 
SparkHadoopMapReduceWriter.scala:88
> 2017-11-16 16:14:01,543 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO mapred.FileInputFormat: Total input paths to process : 1
> 2017-11-16 16:14:01,623 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Registering RDD 6 (mapToPair at SparkCubingByLayer.java:170)
> 2017-11-16 16:14:01,627 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopMapReduceWriter.scala:88) with 1 output partitions
> 2017-11-16 16:14:01,628 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopMapReduceWriter.scala:88)
> 2017-11-16 16:14:01,629 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
> 2017-11-16 16:14:01,638 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
> 2017-11-16 16:14:01,652 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170), which has no missing parents
> 2017-11-16 16:14:01,855 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 25.8 KB, free 365.9 MB)
> 2017-11-16 16:14:01,892 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 10.7 KB, free 365.9 MB)
> 2017-11-16 16:14:01,894 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.1.135:33164 (size: 10.7 KB, free: 366.3 MB)
> 2017-11-16 16:14:01,896 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
> 2017-11-16 16:14:01,922 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170) (first 15 tasks are for partitions Vector(0))
> 2017-11-16 16:14:01,924 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
> 2017-11-16 16:14:02,015 INFO [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:02 INFO
scheduler.TaskSetManager: Starting task 0.0 in sta= ge > 0.0 (TID 0, localhost, executor driver, partition 0, ANY, 4978 bytes) > > 2017-11-16 16:14:02,033 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Running task 0.0 in stage 0.0 > (TID 0) > > 2017-11-16 16:14:02,044 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp > 1510829032977 > > 2017-11-16 16:14:02,159 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO client.TransportClientFactory: Successfully create= d > connection to /192.168.1.135:42799 after 64 ms (0 ms spent in bootstraps) > > 2017-11-16 16:14:02,179 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/metrics-core-2.2.0.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp5518529699147501519.tmp > > 2017-11-16 16:14:02,259 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/metrics-core-2.2.0.jar to > class loader > > 2017-11-16 16:14:02,260 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977 > > 2017-11-16 16:14:02,261 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/guava-12.0.1.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp2368368452706093062.tmp > > 2017-11-16 16:14:02,278 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/guava-12.0.1.jar to class > loader > > 2017-11-16 16:14:02,278 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with timestamp > 1510829032976 > > 2017-11-16 16:14:02,279 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/htrace-core-3.1.0-incubating.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp4539374910339958167.tmp > > 2017-11-16 16:14:02,295 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/htrace-core-3.1.0-incubati= ng.jar > to 
class loader > > 2017-11-16 16:14:02,295 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978 > > 2017-11-16 16:14:02,296 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/kylin-job-2.2.0.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp9086394010889635270.tmp > > 2017-11-16 16:14:02,418 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/kylin-job-2.2.0.jar to > class loader > > 2017-11-16 16:14:02,540 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO rdd.HadoopRDD: Input split: > hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/kylin_intermediate_test_ > sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb/000000_0:0+19534 > > 2017-11-16 16:14:02,569 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO zlib.ZlibFactory: Successfully loaded & initialize= d > native-zlib library > > 2017-11-16 16:14:02,570 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 2017-11-16 16:14:02,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 2017-11-16 16:14:02,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 2017-11-16 16:14:02,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 2017-11-16 16:14:03,035 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 251.01178= ms > > 2017-11-16 16:14:03,106 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 55.530064= ms > > 2017-11-16 16:14:03,148 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO common.AbstractHadoopJob: Ready to load KylinConfi= g > from uri: kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:03,170 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeManager: Initializing CubeManager with > config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:03,170 
INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO persistence.ResourceStore: Using metadata url > kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store > > 2017-11-16 16:14:03,190 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO hdfs.HDFSResourceStore: hdfs meta path : > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:03,194 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeManager: Loading Cube from folder > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube > > 2017-11-16 16:14:03,198 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeDescManager: Initializing CubeDescManager > with config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:03,198 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeDescManager: Reloading Cube Metadata from > folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc > > 2017-11-16 16:14:03,206 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO project.ProjectManager: Initializing ProjectManage= r > with metadata url kylin_metadata@hdfs,path=3Dhdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867- > e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:03,213 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN cachesync.Broadcaster: More than one singleton exi= st > > 2017-11-16 16:14:03,213 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN project.ProjectManager: More than one singleton ex= ist > > 2017-11-16 16:14:03,232 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO metadata.MetadataManager: Reloading data model at > /model_desc/test_sample_model.json > > 2017-11-16 16:14:03,237 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN metadata.MetadataManager: More than one singleton > exist, current keys: 1464031233,1545268424 > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeDescManager: Loaded 1 Cube(s) > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN cube.CubeDescManager: More than one singleton exis= t > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeManager: Reloaded cube test_sample_cube > being CUBE[name=3Dtest_sample_cube] having 1 segments > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN cube.CubeManager: More than one singleton exist > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN cube.CubeManager: type: class > org.apache.kylin.common.KylinConfig reference: 1464031233 > > 2017-11-16 16:14:03,239 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN cube.CubeManager: type: class > org.apache.kylin.common.KylinConfig reference: 1545268424 > > 2017-11-16 16:14:03,283 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 WARN dict.DictionaryManager: More than one singleton ex= ist > > 2017-11-16 16:14:03,283 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/ > 391cbd21-cc46-48f6-8531-47afa69bea83.dict > > 2017-11-16 16:14:03,287 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/ > 0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict > > 2017-11-16 16:14:03,290 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/ > 19c299e7-6190-4951-afd7-163137f3988e.dict > > 2017-11-16 16:14:03,294 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/ > f8564625-7ec0-4074-9a6f-05977b6e3260.dict > > 2017-11-16 16:14:03,297 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/ > 739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict > > 2017-11-16 16:14:03,300 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/ > 8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict > > 2017-11-16 16:14:03,303 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/ > cab12a4e-2ec9-4d28-81dc-fdac31787942.dict > > 
2017-11-16 16:14:03,309 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/ > 44c5ee32-f62c-4d20-a222-954e1c13b537.dict > > 2017-11-16 16:14:03,315 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_ > DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict > > 2017-11-16 16:14:03,321 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe- > 8b11-4fc1-9943-9db12084dea3.dict > > 2017-11-16 16:14:03,329 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390= - > 7bc1693c3f09.dict > > 2017-11-16 16:14:03,335 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/ > d5316933-8229-46c0-bb82-fd0cf01bede5.dict > > 2017-11-16 16:14:03,340 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/ > 4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict > > 2017-11-16 16:14:03,345 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260- > 4f5a-ab01-6e5ac7152672.dict > > 2017-11-16 16:14:03,351 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be- > 3710-44d4-a85e-223ee929068d.dict > > 2017-11-16 16:14:03,357 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ > ae15677f-29b9-4952-a7a5-c0119e3da826.dict > > 2017-11-16 16:14:03,363 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ > 
ab4a3960-ade3-4537-8198-93bc6786a0e8.dict > > 2017-11-16 16:14:03,369 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/ > 328f7642-2092-4d4c-83df-6e7511b0b57a.dict > > 2017-11-16 16:14:03,375 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/ > 2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict > > 2017-11-16 16:14:03,381 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb- > 4b2d-8745-8af522641496.dict > > 2017-11-16 16:14:03,386 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a- > 43cd-a8ed-e18a5a4eba5a.dict > > 2017-11-16 16:14:03,391 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885- > 3299-4912-8570-111fe71bd39d.dict > > 2017-11-16 16:14:03,397 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661- > 9c34-f6dc6e6db62d.dict > > 2017-11-16 16:14:03,402 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393- > 9f8c-49c6-b640-92ad38ef16d0.dict > > 2017-11-16 16:14:03,408 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/ > c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict > > 2017-11-16 16:14:03,414 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092- > 4de55ee3ecbf.dict > > 2017-11-16 16:14:03,420 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c- > 8f05-9a43099c1208.dict > > 2017-11-16 
16:14:03,425 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:03 INFO dict.DictionaryManager: > DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d- > 42d0-b72e-696c131dbe2d.dict > > 2017-11-16 16:14:04,031 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO executor.Executor: Finished task 0.0 in stage 0.0 > (TID 0). 1347 bytes result sent to driver > > 2017-11-16 16:14:04,072 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.TaskSetManager: Finished task 0.0 in sta= ge > 0.0 (TID 0) in 2084 ms on localhost (executor driver) (1/1) > > 2017-11-16 16:14:04,075 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, > whose tasks have all completed, from pool > > 2017-11-16 16:14:04,082 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPa= ir > at SparkCubingByLayer.java:170) finished in 2.118 s > > 2017-11-16 16:14:04,082 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: looking for newly runnable > stages > > 2017-11-16 16:14:04,083 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: running: Set() > > 2017-11-16 16:14:04,083 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1= ) > > 2017-11-16 16:14:04,084 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: failed: Set() > > 2017-11-16 16:14:04,088 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting ResultStage 1 > (MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238), which > has no missing parents > > 2017-11-16 16:14:04,134 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2 stored as > values in memory (estimated size 82.3 KB, free 365.8 MB) > > 2017-11-16 16:14:04,153 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2_piece0 store= d > as bytes in memory (estimated size 31.6 KB, free 365.8 MB) > > 2017-11-16 16:14:04,154 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 > in memory on 192.168.1.135:33164 (size: 31.6 KB, free: 366.2 MB) > > 2017-11-16 16:14:04,155 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO spark.SparkContext: Created broadcast 2 from > broadcast at DAGScheduler.scala:1006 > > 2017-11-16 16:14:04,158 
INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks > from ResultStage 1 (MapPartitionsRDD[8] at mapToPair at > SparkCubingByLayer.java:238) (first 15 tasks are for partitions Vector(0)= ) > > 2017-11-16 16:14:04,158 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 > with 1 tasks > > 2017-11-16 16:14:04,160 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.TaskSetManager: Starting task 0.0 in sta= ge > 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 bytes) > > 2017-11-16 16:14:04,160 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO executor.Executor: Running task 0.0 in stage 1.0 > (TID 1) > > 2017-11-16 16:14:04,204 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1 > non-empty blocks out of 1 blocks > > 2017-11-16 16:14:04,206 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 > remote fetches in 6 ms > > 2017-11-16 16:14:04,315 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO memory.MemoryStore: Block rdd_7_0 stored as bytes = in > memory (estimated size 49.2 KB, free 365.7 MB) > > 2017-11-16 16:14:04,315 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory = on > 192.168.1.135:33164 (size: 49.2 KB, free: 366.2 MB) > > 2017-11-16 16:14:04,331 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer > Algorithm version is 1 > > 2017-11-16 16:14:04,334 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer > Algorithm version is 1 > > 2017-11-16 16:14:04,359 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO common.AbstractHadoopJob: Ready to load KylinConfi= g > from uri: kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:04,377 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO cube.CubeDescManager: Initializing CubeDescManager > with config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:04,377 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO persistence.ResourceStore: Using metadata url > kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store > > 2017-11-16 16:14:04,393 INFO [Scheduler 1211098754 Job > 
26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO hdfs.HDFSResourceStore: hdfs meta path : > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:04,394 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO cube.CubeDescManager: Reloading Cube Metadata from > folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc > > 2017-11-16 16:14:04,400 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO project.ProjectManager: Initializing ProjectManage= r > with metadata url kylin_metadata@hdfs,path=3Dhdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867- > e0ae-4ec2-b2ff-fc5f1cc00dbb > > 2017-11-16 16:14:04,406 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 WARN cachesync.Broadcaster: More than one singleton exi= st > > 2017-11-16 16:14:04,406 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 WARN project.ProjectManager: More than one singleton ex= ist > > 2017-11-16 16:14:04,423 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO metadata.MetadataManager: Reloading data model at > /model_desc/test_sample_model.json > > 2017-11-16 16:14:04,427 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 WARN metadata.MetadataManager: More than one singleton > exist, current keys: 1464031233,1545268424,1474775600 > > 2017-11-16 16:14:04,428 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO cube.CubeDescManager: Loaded 1 Cube(s) > > 2017-11-16 16:14:04,428 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 WARN cube.CubeDescManager: More than one singleton exis= t > > 2017-11-16 16:14:04,498 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO output.FileOutputCommitter: Saved output of task > 'attempt_20171116161401_0001_r_000000_0' to hdfs://trinitybdhdfs/kylin/ > kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/ > test_sample_cube/cuboid/level_base_cuboid/_temporary/0/task_ > 20171116161401_0001_r_000000 > > 2017-11-16 16:14:04,499 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO mapred.SparkHadoopMapRedUtil: > attempt_20171116161401_0001_r_000000_0: Committed > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 ERROR executor.Executor: Exception in task 0.0 in stage > 1.0 (TID 1) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: 
kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. > java:622) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > 2017-11-16 16:14:04,517 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:04,543 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1= .0 > (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: > Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$ > TaskCommitMessage > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. 
> java:622) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:04,544 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > > 2017-11-16 16:14:04,546 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 ERROR scheduler.TaskSetManager: Task 0 in stage 1.0 > failed 1 times; aborting job > > 2017-11-16 16:14:04,547 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, > whose tasks have all completed, from pool > > 2017-11-16 16:14:04,551 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1 > > 2017-11-16 16:14:04,552 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at > SparkHadoopMapReduceWriter.scala:88) failed in 0.393 s due to Job aborted > due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent > failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): > java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > 2017-11-16 16:14:04,552 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > 2017-11-16 16:14:04,552 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488) > > 2017-11-16 16:14:04,552 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > 2017-11-16 16:14:04,552 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517) > > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. 
> java:622)
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315)
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383)
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142)
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617)
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745)
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : >
> > 2017-11-16 16:14:04,553 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Driver stacktrace:
> > 2017-11-16 16:14:04,557 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Job 0 failed: runJob at > SparkHadoopMapReduceWriter.scala:88, took 3.135125 s
> > 2017-11-16 16:14:04,559 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 ERROR io.SparkHadoopMapReduceWriter: Aborting job > job_20171116161401_0008.
> > 2017-11-16 16:14:04,559 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 > in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage > 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: > Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$ > TaskCommitMessage
> > 2017-11-16 16:14:04,559 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);
> > 2017-11-16 16:14:04,559 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)
> > 2017-11-16 16:14:04,559 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97)
> > 2017-11-16 16:14:04,559 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)
> > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. > java:622)
> > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance.
> serialize(KryoSerializer.scala:315) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Driver stacktrace: > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$ > scheduler$DAGScheduler$$failJobAndIndependentStages( > DAGScheduler.scala:1499) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1487) > > 2017-11-16 16:14:04,560 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1486) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at scala.collection.mutable.ResizableArray$class.foreach( > ResizableArray.scala:59) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at scala.collection.mutable.ArrayBuffer.foreach( > ArrayBuffer.scala:48) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.abortStage( > DAGScheduler.scala:1486) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at scala.Option.foreach(Option.scala:257) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed( > DAGScheduler.scala:814) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop. > doOnReceive(DAGScheduler.scala:1714) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop. > onReceive(DAGScheduler.scala:1669) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop. > onReceive(DAGScheduler.scala:1658) > > 2017-11-16 16:14:04,561 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.util.EventLoop$$anon$1.run( > EventLoop.scala:48) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.runJob( > DAGScheduler.scala:630) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$. > write(SparkHadoopMapReduceWriter.scala:88) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:151) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:112) > > 2017-11-16 16:14:04,562 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.RDD.withScope(RDD.scala:362) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions. 
> saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.api.java.JavaPairRDD. > saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS( > SparkCubingByLayer.java:238) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.engine.spark.SparkCubingByLayer.execute( > SparkCubingByLayer.java:192) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.common.util.AbstractApplication.execute( > AbstractApplication.java:37) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.common.util.SparkEntry.main(SparkEntry. > java:44) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at sun.reflect.NativeMethodAccessorImpl.invoke( > NativeMethodAccessorImpl.java:62) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at sun.reflect.DelegatingMethodAccessorImpl.invoke( > DelegatingMethodAccessorImpl.java:43) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.reflect.Method.invoke(Method.java:498) > > 2017-11-16 16:14:04,563 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$ > deploy$SparkSubmit$$runMain(SparkSubmit.scala:755) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1( > SparkSubmit.scala:180) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.submit( > SparkSubmit.scala:205) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit. > scala:119) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Caused by: java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: kryo.register(org.apache. 
> spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. > java:622) > > 2017-11-16 16:14:04,564 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > 2017-11-16 16:14:04,565 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > 2017-11-16 16:14:04,565 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > 2017-11-16 16:14:04,565 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > 2017-11-16 16:14:04,565 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:04,570 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Exception in thread "main" java.lang.RuntimeException: error execute > org.apache.kylin.engine.spark.SparkCubingByLayer > > 2017-11-16 16:14:04,570 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.common.util.AbstractApplication.execute( > AbstractApplication.java:42) > > 2017-11-16 16:14:04,570 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.common.util.SparkEntry.main(SparkEntry. 
> java:44) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at sun.reflect.NativeMethodAccessorImpl.invoke( > NativeMethodAccessorImpl.java:62) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at sun.reflect.DelegatingMethodAccessorImpl.invoke( > DelegatingMethodAccessorImpl.java:43) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.reflect.Method.invoke(Method.java:498) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$ > deploy$SparkSubmit$$runMain(SparkSubmit.scala:755) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1( > SparkSubmit.scala:180) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.submit( > SparkSubmit.scala:205) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit. > scala:119) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Caused by: org.apache.spark.SparkException: Job aborted. > > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$. 
> write(SparkHadoopMapReduceWriter.scala:107)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:151)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:112)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
> > 2017-11-16 16:14:04,571 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.rdd.PairRDDFunctions. > saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084)
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.api.java.JavaPairRDD. > saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831)
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS( > SparkCubingByLayer.java:238)
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.engine.spark.SparkCubingByLayer.execute( > SparkCubingByLayer.java:192)
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.kylin.common.util.AbstractApplication.execute( > AbstractApplication.java:37)
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > ... 10 more
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Caused by: org.apache.spark.SparkException: Job aborted due to stage > failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task > 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: > Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$ > TaskCommitMessage
> > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: kryo.register(org.apache.
> spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488) > > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517) > > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. > java:622) > > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > 2017-11-16 16:14:04,572 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Driver stacktrace: > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$ > scheduler$DAGScheduler$$failJobAndIndependentStages( > DAGScheduler.scala:1499) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1487) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1486) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at scala.collection.mutable.ResizableArray$class.foreach( > ResizableArray.scala:59) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at scala.collection.mutable.ArrayBuffer.foreach( > ArrayBuffer.scala:48) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at 
org.apache.spark.scheduler.DAGScheduler.abortStage( > DAGScheduler.scala:1486) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at scala.Option.foreach(Option.scala:257) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed( > DAGScheduler.scala:814) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop. > doOnReceive(DAGScheduler.scala:1714) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop. > onReceive(DAGScheduler.scala:1669) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop. > onReceive(DAGScheduler.scala:1658) > > 2017-11-16 16:14:04,573 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.util.EventLoop$$anon$1.run( > EventLoop.scala:48) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.scheduler.DAGScheduler.runJob( > DAGScheduler.scala:630) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$. > write(SparkHadoopMapReduceWriter.scala:88) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > ... 21 more > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Caused by: java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > Note: To register this class use: kryo.register(org.apache. 
> spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo. > java:622) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > 2017-11-16 16:14:04,574 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:04,575 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO spark.SparkContext: Invoking stop() from shutdown > hook > > 2017-11-16 16:14:04,579 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO server.AbstractConnector: Stopped Spark@60bdda65 > {HTTP/1.1,[http/1.1]}{0.0.0.0:4040} > > 2017-11-16 16:14:04,581 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO ui.SparkUI: Stopped Spark web UI at > http://192.168.1.135:4040 > > 2017-11-16 16:14:04,636 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO spark.MapOutputTrackerMasterEndpoint: > MapOutputTrackerMasterEndpoint stopped! 
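(A short aside on the "Class is not registered" hint that Kryo prints throughout the trace above: Kryo only raises it when registration is required, which in Spark terms means spark.kryo.registrationRequired=true, and the class it names, org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage, is what the commit path at SparkHadoopMapReduceWriter.scala:88 tries to serialize. Purely as an illustration of the API the message points at, and not of how the Kylin job wires its own serializer, registering such a class in a plain Spark application could look roughly like the sketch below; the class name is taken verbatim from the stack trace, everything else is assumed for the example.

    import org.apache.spark.SparkConf;

    public class KryoRegistrationSketch {
        public static void main(String[] args) throws ClassNotFoundException {
            SparkConf conf = new SparkConf()
                    .setAppName("kryo-registration-sketch")
                    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                    // either list the class name in the Spark config ...
                    .set("spark.kryo.classesToRegister",
                         "org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage");
            // ... or register the Class object programmatically.
            conf.registerKryoClasses(new Class<?>[] {
                    Class.forName("org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage")
            });
        }
    }

Whether registering the class is actually the right fix depends on why the Spark runtime that executed this job needs a class the job's registrator does not know about.)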
> > 2017-11-16 16:14:04,643 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO memory.MemoryStore: MemoryStore cleared
> > 2017-11-16 16:14:04,644 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO storage.BlockManager: BlockManager stopped
> > 2017-11-16 16:14:04,649 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO storage.BlockManagerMaster: BlockManagerMaster > stopped
> > 2017-11-16 16:14:04,651 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO scheduler.OutputCommitCoordinator$ > OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
> > 2017-11-16 16:14:04,653 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO spark.SparkContext: Successfully stopped SparkContext
> > 2017-11-16 16:14:04,653 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO util.ShutdownHookManager: Shutdown hook called
> > 2017-11-16 16:14:04,654 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : > 17/11/16 16:14:04 INFO util.ShutdownHookManager: Deleting directory > /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6
> > 2017-11-16 16:14:05,140 ERROR [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:156 : > error run spark job:
> > java.io.IOException: OS command error exit with return code: 1, error > message: SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer > -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb > -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs > ,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > -cubename test_sample_cube
> > Abstract Application args:-hiveTable default.kylin_intermediate_ > test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output > hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs > ,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > -cubename test_sample_cube
> > 17/11/16 16:13:51 INFO spark.SparkContext: Running Spark version 2.2.0
> > 17/11/16 16:13:52 INFO spark.SparkContext: Submitted application: Cubing > for:test_sample_cube segment d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
> > 17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls to: hdfs
> > 17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls to: hdfs
> > 17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls groups > to:
> > 17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls groups > to:
> > 17/11/16 16:13:52 INFO spark.SecurityManager: SecurityManager: > authentication disabled; ui acls disabled; users with view permissions: > Set(hdfs); groups with view permissions: Set(); users with modify > permissions: Set(hdfs); groups with modify permissions: Set()
> >
17/11/16 16:13:52 INFO util.Utils: Successfully started service > 'sparkDriver' on port 42799.
> > 17/11/16 16:13:52 INFO spark.SparkEnv: Registering MapOutputTracker
> > 17/11/16 16:13:52 INFO spark.SparkEnv: Registering BlockManagerMaster
> > 17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: Using > org.apache.spark.storage.DefaultTopologyMapper for getting topology > information
> > 17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: > BlockManagerMasterEndpoint up
> > 17/11/16 16:13:52 INFO storage.DiskBlockManager: Created local directory > at /tmp/blockmgr-b8d6ec0d-8a73-4ce6-9dbf-64002d5e2a62
> > 17/11/16 16:13:52 INFO memory.MemoryStore: MemoryStore started with > capacity 366.3 MB
> > 17/11/16 16:13:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator
> > 17/11/16 16:13:52 INFO util.log: Logging initialized @2149ms
> > 17/11/16 16:13:52 INFO server.Server: jetty-9.3.z-SNAPSHOT
> > 17/11/16 16:13:52 INFO server.Server: Started @2235ms
> > 17/11/16 16:13:52 INFO server.AbstractConnector: Started > ServerConnector@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
> > 17/11/16 16:13:52 INFO util.Utils: Successfully started service 'SparkUI' > on port 4040.
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@f5c79a6{/jobs,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1fc793c2{/jobs/json,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@329a1243{/jobs/job,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@27f9e982{/jobs/job/json,null, > AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@37d3d232{/stages,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@581d969c{/stages/json,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2b46a8c1{/stages/stage,null, > AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@5851bd4f{/stages/stage/json, > null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2f40a43{/stages/pool,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@69c43e48{/stages/pool/json, > null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@3a80515c{/storage,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1c807b1d{/storage/json,null, > AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1b39fd82{/storage/rdd,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@21680803{/storage/rdd/json, > null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@c8b96ec{/environment,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2d8f2f3a{/environment/json, > null,AVAILABLE,@Spark}
> > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@7048f722{/executors,null,AVAILABLE,@Spark}
> > 17/11/16
16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@58a55449{/executors/json,null, > AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@6e0ff644{/executors/ > threadDump,null,AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2a2bb0eb{/executors/threadDump/json,null, > AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2d0566ba{/static,null,AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@29d2d081{/,null,AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@58783f6c{/api,null,AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@88d6f9b{/jobs/job/kill,null, > AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@475b7792{/stages/stage/kill, > null,AVAILABLE,@Spark} > > 17/11/16 16:13:52 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started > at http://192.168.1.135:4040 > > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar at > spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with > timestamp 1510829032976 > > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar at spark:// > 192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp > 1510829032977 > > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar at spark:// > 192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977 > > 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR > file:/usr/local/kylin/lib/kylin-job-2.2.0.jar at spark:// > 192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978 > > 17/11/16 16:13:53 INFO executor.Executor: Starting executor ID driver on > host localhost > > 17/11/16 16:13:53 INFO util.Utils: Successfully started service > 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33164. 
> > 17/11/16 16:13:53 INFO netty.NettyBlockTransferService: Server created on > 192.168.1.135:33164
> > 17/11/16 16:13:53 INFO storage.BlockManager: Using > org.apache.spark.storage.RandomBlockReplicationPolicy for block > replication policy
> > 17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registering > BlockManager BlockManagerId(driver, 192.168.1.135, 33164, None)
> > 17/11/16 16:13:53 INFO storage.BlockManagerMasterEndpoint: Registering > block manager 192.168.1.135:33164 with 366.3 MB RAM, > BlockManagerId(driver, 192.168.1.135, 33164, None)
> > 17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registered BlockManager > BlockManagerId(driver, 192.168.1.135, 33164, None)
> > 17/11/16 16:13:53 INFO storage.BlockManager: Initialized BlockManager: > BlockManagerId(driver, 192.168.1.135, 33164, None)
> > 17/11/16 16:13:53 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@1b9c1b51{/metrics/json,null, > AVAILABLE,@Spark}
> > 17/11/16 16:13:54 INFO scheduler.EventLoggingListener: Logging events to > hdfs:///kylin/spark-history/local-1510829033012
> > 17/11/16 16:13:54 INFO common.AbstractHadoopJob: Ready to load KylinConfig > from uri: kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
> > 17/11/16 16:13:54 INFO cube.CubeManager: Initializing CubeManager with > config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
> > 17/11/16 16:13:54 INFO persistence.ResourceStore: Using metadata url > kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store
> > 17/11/16 16:13:54 INFO hdfs.HDFSResourceStore: hdfs meta path : > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
> > 17/11/16 16:13:54 INFO cube.CubeManager: Loading Cube from folder > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube
> > 17/11/16 16:13:54 INFO cube.CubeDescManager: Initializing CubeDescManager > with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
> > 17/11/16 16:13:54 INFO cube.CubeDescManager: Reloading Cube Metadata from > folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc
> > 17/11/16 16:13:54 INFO project.ProjectManager: Initializing ProjectManager > with metadata url kylin_metadata@hdfs,path=hdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867- > e0ae-4ec2-b2ff-fc5f1cc00dbb
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: Checking custom measure > types from kylin config
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > COUNT_DISTINCT(hllc), class org.apache.kylin.measure.hllc. > HLLCMeasureType$Factory
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > COUNT_DISTINCT(bitmap), class org.apache.kylin.measure. > bitmap.BitmapMeasureType$Factory
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > TOP_N(topn), class org.apache.kylin.measure.topn.TopNMeasureType$Factory
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering RAW(raw), > class org.apache.kylin.measure.raw.RawMeasureType$Factory
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > EXTENDED_COLUMN(extendedcolumn), class org.apache.kylin.measure.
> extendedcolumn.ExtendedColumnMeasureType$Factory
> > 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering > PERCENTILE(percentile), class org.apache.kylin.measure.percentile. > PercentileMeasureType$Factory
> > 17/11/16 16:13:54 INFO metadata.MetadataManager: Reloading data model at > /model_desc/test_sample_model.json
> > 17/11/16 16:13:54 INFO cube.CubeDescManager: Loaded 1 Cube(s)
> > 17/11/16 16:13:54 INFO cube.CubeManager: Reloaded cube test_sample_cube > being CUBE[name=test_sample_cube] having 1 segments
> > 17/11/16 16:13:54 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes
> > 17/11/16 16:13:54 INFO spark.SparkCubingByLayer: RDD Output path: > hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
> > 17/11/16 16:13:55 INFO spark.SparkCubingByLayer: All measure are normal > (agg on all cuboids) ? : true
> > 17/11/16 16:13:55 INFO internal.SharedState: loading hive config file: > file:/usr/local/spark/conf/hive-site.xml
> > 17/11/16 16:13:55 INFO internal.SharedState: spark.sql.warehouse.dir is > not set, but hive.metastore.warehouse.dir is set. Setting > spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir > ('/apps/hive/warehouse').
> > 17/11/16 16:13:55 INFO internal.SharedState: Warehouse path is > '/apps/hive/warehouse'.
> > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@75cf0de5{/SQL,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@468173fa{/SQL/json,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@27e2287c{/SQL/execution,null, > AVAILABLE,@Spark}
> > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@2cd388f5{/SQL/execution/json, > null,AVAILABLE,@Spark}
> > 17/11/16 16:13:55 INFO handler.ContextHandler: Started > o.s.j.s.ServletContextHandler@4207852d{/static/sql,null,AVAILABLE,@Spark}
> > 17/11/16 16:13:56 INFO hive.HiveUtils: Initializing > HiveMetastoreConnection version 1.2.1 using Spark classes.
> > 17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with > URI thrift://master01.trinitymobility.local:9083
> > 17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.
> > 17/11/16 16:13:57 INFO session.SessionState: Created local directory: > /tmp/58660d8b-48ac-4cf0-bd06-6b96018a5482_resources
> > 17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482
> > 17/11/16 16:13:57 INFO session.SessionState: Created local directory: > /tmp/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482
> > 17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482/_tmp_space.db
> > 17/11/16 16:13:57 INFO client.HiveClientImpl: Warehouse location for Hive > client (version 1.2.1) is /apps/hive/warehouse
> > 17/11/16 16:13:57 INFO sqlstd.SQLStdHiveAccessController: Created > SQLStdHiveAccessController for session context : HiveAuthzSessionContext > [sessionString=58660d8b-48ac-4cf0-bd06-6b96018a5482, clientType=HIVECLI]
> > 17/11/16 16:13:57 INFO hive.metastore: Mestastore configuration > hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl > to org.apache.hadoop.hive.ql.security.authorization.plugin.
> AuthorizationMetaStoreFilterHook
> > 17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with > URI thrift://master01.trinitymobility.local:9083
> > 17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.
> > 17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration > hive.metastore.filter.hook changed from org.apache.hadoop.hive.ql. > security.authorization.plugin.AuthorizationMetaStoreFilterHook to > org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
> > 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with > URI thrift://master01.trinitymobility.local:9083
> > 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
> > 17/11/16 16:13:58 INFO session.SessionState: Created local directory: > /tmp/bd69eb21-01c1-4dd3-b31c-16e065ab4101_resources
> > 17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101
> > 17/11/16 16:13:58 INFO session.SessionState: Created local directory: > /tmp/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101
> > 17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: > /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101/_tmp_space.db
> > 17/11/16 16:13:58 INFO client.HiveClientImpl: Warehouse location for Hive > client (version 1.2.1) is /apps/hive/warehouse
> > 17/11/16 16:13:58 INFO sqlstd.SQLStdHiveAccessController: Created > SQLStdHiveAccessController for session context : HiveAuthzSessionContext > [sessionString=bd69eb21-01c1-4dd3-b31c-16e065ab4101, clientType=HIVECLI]
> > 17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration > hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl > to org.apache.hadoop.hive.ql.security.authorization.plugin. > AuthorizationMetaStoreFilterHook
> > 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with > URI thrift://master01.trinitymobility.local:9083
> > 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
> > 17/11/16 16:13:58 INFO state.StateStoreCoordinatorRef: Registered > StateStoreCoordinator endpoint
> > 17/11/16 16:13:58 INFO execution.SparkSqlParser: Parsing command: > default.kylin_intermediate_test_sample_cube_d4ccd867_ > e0ae_4ec2_b2ff_fc5f1cc00dbb
> > 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with > URI thrift://master01.trinitymobility.local:9083
> > 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: timestamp
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: boolean
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
> > 17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0 stored as > values in memory (estimated size 373.5 KB, free 365.9 MB)
> > 17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0_piece0 stored > as bytes in memory (estimated size 35.8 KB, free 365.9 MB)
> > 17/11/16 16:14:00 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 > in memory on 192.168.1.135:33164 (size: 35.8 KB, free: 366.3 MB)
> > 17/11/16 16:14:00 INFO spark.SparkContext: Created broadcast 0 from
> > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/ > 391cbd21-cc46-48f6-8531-47afa69bea83.dict
> > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/ > 0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict
> > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/ > 19c299e7-6190-4951-afd7-163137f3988e.dict
> > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at >
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/ > f8564625-7ec0-4074-9a6f-05977b6e3260.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/ > 739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/ > 8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/ > cab12a4e-2ec9-4d28-81dc-fdac31787942.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/ > 44c5ee32-f62c-4d20-a222-954e1c13b537.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_ > DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe- > 8b11-4fc1-9943-9db12084dea3.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390= - > 7bc1693c3f09.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/ > d5316933-8229-46c0-bb82-fd0cf01bede5.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/ > 4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260- > 4f5a-ab01-6e5ac7152672.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be- > 3710-44d4-a85e-223ee929068d.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ > ae15677f-29b9-4952-a7a5-c0119e3da826.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ > ab4a3960-ade3-4537-8198-93bc6786a0e8.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/ > 328f7642-2092-4d4c-83df-6e7511b0b57a.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/ > 2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict > > 17/11/16 16:14:01 INFO 
dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb- > 4b2d-8745-8af522641496.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a- > 43cd-a8ed-e18a5a4eba5a.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885- > 3299-4912-8570-111fe71bd39d.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661- > 9c34-f6dc6e6db62d.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393- > 9f8c-49c6-b640-92ad38ef16d0.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/ > c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092- > 4de55ee3ecbf.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c- > 8f05-9a43099c1208.dict > > 17/11/16 16:14:01 INFO dict.DictionaryManager: > DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d- > 42d0-b72e-696c131dbe2d.dict > > 17/11/16 16:14:01 INFO common.CubeStatsReader: Estimating size for layer > 0, all cuboids are 536870911, total size is 0.010198831558227539 > > 17/11/16 16:14:01 INFO spark.SparkCubingByLayer: Partition for spark > cubing: 1 > > 17/11/16 16:14:01 INFO output.FileOutputCommitter: File Output Committer > Algorithm version is 1 > > 17/11/16 16:14:01 INFO spark.SparkContext: Starting job: runJob at > SparkHadoopMapReduceWriter.scala:88 > > 17/11/16 16:14:01 INFO mapred.FileInputFormat: Total input paths to > process : 1 > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Registering RDD 6 > (mapToPair at SparkCubingByLayer.java:170) > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Got job 0 (runJob at > SparkHadoopMapReduceWriter.scala:88) with 1 output partitions > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 > (runJob at SparkHadoopMapReduceWriter.scala:88) > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Parents of final stage: > List(ShuffleMapStage 0) > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Missing parents: > List(ShuffleMapStage 0) > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage > 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170), whic= h > has no missing parents > > 17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1 stored as > values in memory (estimated size 25.8 KB, free 365.9 MB) > > 17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1_piece0 store= d > as bytes in memory (estimated size 10.7 KB, free 365.9 MB) > > 17/11/16 16:14:01 INFO storage.BlockManagerInfo: Added 
broadcast_1_piece0 > in memory on 192.168.1.135:33164 (size: 10.7 KB, free: 366.3 MB) > > 17/11/16 16:14:01 INFO spark.SparkContext: Created broadcast 1 from > broadcast at DAGScheduler.scala:1006 > > 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks > from ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at > SparkCubingByLayer.java:170) (first 15 tasks are for partitions Vector(0)= ) > > 17/11/16 16:14:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 > with 1 tasks > > 17/11/16 16:14:02 INFO scheduler.TaskSetManager: Starting task 0.0 in > stage 0.0 (TID 0, localhost, executor driver, partition 0, ANY, 4978 byte= s) > > 17/11/16 16:14:02 INFO executor.Executor: Running task 0.0 in stage 0.0 > (TID 0) > > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp > 1510829032977 > > 17/11/16 16:14:02 INFO client.TransportClientFactory: Successfully create= d > connection to /192.168.1.135:42799 after 64 ms (0 ms spent in bootstraps) > > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/metrics-core-2.2.0.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp5518529699147501519.tmp > > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/metrics-core-2.2.0.jar to > class loader > > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977 > > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/guava-12.0.1.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp2368368452706093062.tmp > > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/guava-12.0.1.jar to class > loader > > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with timestamp > 1510829032976 > > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/htrace-core-3.1.0-incubating.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp4539374910339958167.tmp > > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/htrace-core-3.1.0-incubati= ng.jar > to class loader > > 17/11/16 16:14:02 INFO executor.Executor: Fetching spark:// > 192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978 > > 17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/ > jars/kylin-job-2.2.0.jar to /tmp/spark-1baf8c03-622c-4406- > 9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/ > fetchFileTemp9086394010889635270.tmp > > 17/11/16 16:14:02 INFO executor.Executor: Adding > file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/ > userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/kylin-job-2.2.0.jar to > class loader > > 17/11/16 16:14:02 INFO rdd.HadoopRDD: Input split: > hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/kylin_intermediate_test_ > sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb/000000_0:0+19534 > > 17/11/16 16:14:02 INFO 
zlib.ZlibFactory: Successfully loaded & initialize= d > native-zlib library > > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor > [.deflate] > > 17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 251.01178 > ms > > 17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 55.530064 > ms > > 17/11/16 16:14:03 INFO common.AbstractHadoopJob: Ready to load KylinConfi= g > from uri: kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:03 INFO cube.CubeManager: Initializing CubeManager with > config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:03 INFO persistence.ResourceStore: Using metadata url > kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store > > 17/11/16 16:14:03 INFO hdfs.HDFSResourceStore: hdfs meta path : > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:03 INFO cube.CubeManager: Loading Cube from folder > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube > > 17/11/16 16:14:03 INFO cube.CubeDescManager: Initializing CubeDescManager > with config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:03 INFO cube.CubeDescManager: Reloading Cube Metadata from > folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc > > 17/11/16 16:14:03 INFO project.ProjectManager: Initializing ProjectManage= r > with metadata url kylin_metadata@hdfs,path=3Dhdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867- > e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:03 WARN cachesync.Broadcaster: More than one singleton exi= st > > 17/11/16 16:14:03 WARN project.ProjectManager: More than one singleton > exist > > 17/11/16 16:14:03 INFO metadata.MetadataManager: Reloading data model at > /model_desc/test_sample_model.json > > 17/11/16 16:14:03 WARN metadata.MetadataManager: More than one singleton > exist, current keys: 1464031233,1545268424 > > 17/11/16 16:14:03 INFO cube.CubeDescManager: Loaded 1 Cube(s) > > 17/11/16 16:14:03 WARN cube.CubeDescManager: More than one singleton exis= t > > 17/11/16 16:14:03 INFO cube.CubeManager: Reloaded cube test_sample_cube > being CUBE[name=3Dtest_sample_cube] having 1 segments > > 17/11/16 16:14:03 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes > > 17/11/16 16:14:03 WARN cube.CubeManager: More than one singleton exist > > 17/11/16 16:14:03 WARN cube.CubeManager: type: class > org.apache.kylin.common.KylinConfig reference: 1464031233 > > 17/11/16 16:14:03 WARN cube.CubeManager: type: class > org.apache.kylin.common.KylinConfig reference: 1545268424 > > 17/11/16 16:14:03 WARN dict.DictionaryManager: More than one singleton > exist > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/ > 391cbd21-cc46-48f6-8531-47afa69bea83.dict > 
> 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/ > 0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/ > 19c299e7-6190-4951-afd7-163137f3988e.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/ > f8564625-7ec0-4074-9a6f-05977b6e3260.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/ > 739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/ > 8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/ > cab12a4e-2ec9-4d28-81dc-fdac31787942.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/ > 44c5ee32-f62c-4d20-a222-954e1c13b537.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_ > DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe- > 8b11-4fc1-9943-9db12084dea3.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390= - > 7bc1693c3f09.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/ > d5316933-8229-46c0-bb82-fd0cf01bede5.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/ > 4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260- > 4f5a-ab01-6e5ac7152672.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be- > 3710-44d4-a85e-223ee929068d.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > 
/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ > ae15677f-29b9-4952-a7a5-c0119e3da826.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ > ab4a3960-ade3-4537-8198-93bc6786a0e8.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/ > 328f7642-2092-4d4c-83df-6e7511b0b57a.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/ > 2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb- > 4b2d-8745-8af522641496.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a- > 43cd-a8ed-e18a5a4eba5a.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885- > 3299-4912-8570-111fe71bd39d.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661- > 9c34-f6dc6e6db62d.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393- > 9f8c-49c6-b640-92ad38ef16d0.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/ > c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092- > 4de55ee3ecbf.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c- > 8f05-9a43099c1208.dict > > 17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager > (1001935557 <1001935557>) loading DictionaryInfo(loadDictObj:true) at > /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d- > 42d0-b72e-696c131dbe2d.dict > > 17/11/16 16:14:04 INFO executor.Executor: Finished task 0.0 in stage 0.0 > (TID 0). 
1347 bytes result sent to driver > > 17/11/16 16:14:04 INFO scheduler.TaskSetManager: Finished task 0.0 in > stage 0.0 (TID 0) in 2084 ms on localhost (executor driver) (1/1) > > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, > whose tasks have all completed, from pool > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: ShuffleMapStage 0 > (mapToPair at SparkCubingByLayer.java:170) finished in 2.118 s > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: looking for newly runnable > stages > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: running: Set() > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1= ) > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: failed: Set() > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting ResultStage 1 > (MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238), which > has no missing parents > > 17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2 stored as > values in memory (estimated size 82.3 KB, free 365.8 MB) > > 17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2_piece0 store= d > as bytes in memory (estimated size 31.6 KB, free 365.8 MB) > > 17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 > in memory on 192.168.1.135:33164 (size: 31.6 KB, free: 366.2 MB) > > 17/11/16 16:14:04 INFO spark.SparkContext: Created broadcast 2 from > broadcast at DAGScheduler.scala:1006 > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks > from ResultStage 1 (MapPartitionsRDD[8] at mapToPair at > SparkCubingByLayer.java:238) (first 15 tasks are for partitions Vector(0)= ) > > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 > with 1 tasks > > 17/11/16 16:14:04 INFO scheduler.TaskSetManager: Starting task 0.0 in > stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 byte= s) > > 17/11/16 16:14:04 INFO executor.Executor: Running task 0.0 in stage 1.0 > (TID 1) > > 17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1 > non-empty blocks out of 1 blocks > > 17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 > remote fetches in 6 ms > > 17/11/16 16:14:04 INFO memory.MemoryStore: Block rdd_7_0 stored as bytes > in memory (estimated size 49.2 KB, free 365.7 MB) > > 17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory > on 192.168.1.135:33164 (size: 49.2 KB, free: 366.2 MB) > > 17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer > Algorithm version is 1 > > 17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer > Algorithm version is 1 > > 17/11/16 16:14:04 INFO common.AbstractHadoopJob: Ready to load KylinConfi= g > from uri: kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:04 INFO cube.CubeDescManager: Initializing CubeDescManager > with config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:04 INFO persistence.ResourceStore: Using metadata url > kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_ > metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store > > 17/11/16 16:14:04 INFO hdfs.HDFSResourceStore: hdfs meta path : > hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:04 INFO cube.CubeDescManager: Reloading Cube Metadata from > folder 
hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/ > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc > > 17/11/16 16:14:04 INFO project.ProjectManager: Initializing ProjectManager > with metadata url kylin_metadata@hdfs,path=hdfs: > //trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867- > e0ae-4ec2-b2ff-fc5f1cc00dbb > > 17/11/16 16:14:04 WARN cachesync.Broadcaster: More than one singleton exist > > 17/11/16 16:14:04 WARN project.ProjectManager: More than one singleton > exist > > 17/11/16 16:14:04 INFO metadata.MetadataManager: Reloading data model at > /model_desc/test_sample_model.json > > 17/11/16 16:14:04 WARN metadata.MetadataManager: More than one singleton > exist, current keys: 1464031233,1545268424,1474775600 > > 17/11/16 16:14:04 INFO cube.CubeDescManager: Loaded 1 Cube(s) > > 17/11/16 16:14:04 WARN cube.CubeDescManager: More than one singleton exist > > 17/11/16 16:14:04 INFO output.FileOutputCommitter: Saved output of task > 'attempt_20171116161401_0001_r_000000_0' to hdfs://trinitybdhdfs/kylin/ > kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/ > test_sample_cube/cuboid/level_base_cuboid/_temporary/0/task_ > 20171116161401_0001_r_000000 > > 17/11/16 16:14:04 INFO mapred.SparkHadoopMapRedUtil: > attempt_20171116161401_0001_r_000000_0: Committed > > 17/11/16 16:14:04 ERROR executor.Executor: Exception in task 0.0 in stage > 1.0 (TID 1) > > java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > 17/11/16 16:14:04 WARN scheduler.TaskSetManager: Lost task 0.0 in stage > 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: > Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$ > TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. 
> serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > > > 17/11/16 16:14:04 ERROR scheduler.TaskSetManager: Task 0 in stage 1.0 > failed 1 times; aborting job > > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, > whose tasks have all completed, from pool > > 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1 > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at > SparkHadoopMapReduceWriter.scala:88) failed in 0.393 s due to Job aborted > due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent > failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): > java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > > > Driver stacktrace: > > 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Job 0 failed: runJob at > SparkHadoopMapReduceWriter.scala:88, took 3.135125 s > > 17/11/16 16:14:04 ERROR io.SparkHadoopMapReduceWriter: Aborting job > job_20171116161401_0008. > > org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 > in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage > 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentExcepti= on: > Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$ > TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. 
> serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > > > Driver stacktrace: > > at org.apache.spark.scheduler.DAGScheduler.org > $apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages( > DAGScheduler.scala:1499) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1487) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1486) > > at scala.collection.mutable.ResizableArray$class.foreach( > ResizableArray.scala:59) > > at scala.collection.mutable.ArrayBuffer.foreach( > ArrayBuffer.scala:48) > > at org.apache.spark.scheduler.DAGScheduler.abortStage( > DAGScheduler.scala:1486) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > at scala.Option.foreach(Option.scala:257) > > at org.apache.spark.scheduler.DAGScheduler. > handleTaskSetFailed(DAGScheduler.scala:814) > > at org.apache.spark.scheduler. > DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714) > > at org.apache.spark.scheduler. > DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669) > > at org.apache.spark.scheduler. > DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658) > > at org.apache.spark.util.EventLoop$$anon$1.run( > EventLoop.scala:48) > > at org.apache.spark.scheduler.DAGScheduler.runJob( > DAGScheduler.scala:630) > > at org.apache.spark.SparkContext. > runJob(SparkContext.scala:2022) > > at org.apache.spark.SparkContext. > runJob(SparkContext.scala:2043) > > at org.apache.spark.SparkContext. > runJob(SparkContext.scala:2075) > > at org.apache.spark.internal.io. > SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:88) > > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085) > > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085) > > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085) > > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:151) > > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:112) > > at org.apache.spark.rdd.RDD.withScope(RDD.scala:362) > > at org.apache.spark.rdd.PairRDDFunctions. > saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084) > > at org.apache.spark.api.java.JavaPairRDD. > saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831) > > at org.apache.kylin.engine.spark. > SparkCubingByLayer.saveToHDFS(SparkCubingByLayer.java:238) > > at org.apache.kylin.engine.spark. > SparkCubingByLayer.execute(SparkCubingByLayer.java:192) > > at org.apache.kylin.common.util. > AbstractApplication.execute(AbstractApplication.java:37) > > at org.apache.kylin.common.util. 
> SparkEntry.main(SparkEntry.java:44) > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native > Method) > > at sun.reflect.NativeMethodAccessorImpl.invoke( > NativeMethodAccessorImpl.java:62) > > at sun.reflect.DelegatingMethodAccessorImpl.invoke( > DelegatingMethodAccessorImpl.java:43) > > at java.lang.reflect.Method.invoke(Method.java:498) > > at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$ > deploy$SparkSubmit$$runMain(SparkSubmit.scala:755) > > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1( > SparkSubmit.scala:180) > > at org.apache.spark.deploy.SparkSubmit$.submit( > SparkSubmit.scala:205) > > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit. > scala:119) > > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit. > scala) > > Caused by: java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > Exception in thread "main" java.lang.RuntimeException: error execute > org.apache.kylin.engine.spark.SparkCubingByLayer > > at org.apache.kylin.common.util. > AbstractApplication.execute(AbstractApplication.java:42) > > at org.apache.kylin.common.util. > SparkEntry.main(SparkEntry.java:44) > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native > Method) > > at sun.reflect.NativeMethodAccessorImpl.invoke( > NativeMethodAccessorImpl.java:62) > > at sun.reflect.DelegatingMethodAccessorImpl.invoke( > DelegatingMethodAccessorImpl.java:43) > > at java.lang.reflect.Method.invoke(Method.java:498) > > at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$ > deploy$SparkSubmit$$runMain(SparkSubmit.scala:755) > > at org.apache.spark.deploy.SparkSubmit$.doRunMain$1( > SparkSubmit.scala:180) > > at org.apache.spark.deploy.SparkSubmit$.submit( > SparkSubmit.scala:205) > > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit. > scala:119) > > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit. > scala) > > Caused by: org.apache.spark.SparkException: Job aborted. > > at org.apache.spark.internal.io. 
> SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:107) > > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085) > > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085) > > at org.apache.spark.rdd.PairRDDFunctions$$anonfun$ > saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085) > > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:151) > > at org.apache.spark.rdd.RDDOperationScope$.withScope( > RDDOperationScope.scala:112) > > at org.apache.spark.rdd.RDD.withScope(RDD.scala:362) > > at org.apache.spark.rdd.PairRDDFunctions. > saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084) > > at org.apache.spark.api.java.JavaPairRDD. > saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831) > > at org.apache.kylin.engine.spark. > SparkCubingByLayer.saveToHDFS(SparkCubingByLayer.java:238) > > at org.apache.kylin.engine.spark. > SparkCubingByLayer.execute(SparkCubingByLayer.java:192) > > at org.apache.kylin.common.util. > AbstractApplication.execute(AbstractApplication.java:37) > > ... 10 more > > Caused by: org.apache.spark.SparkException: Job aborted due to stage > failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost ta= sk > 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalAr= gumentException: > Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$ > TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > > > Driver stacktrace: > > at org.apache.spark.scheduler.DAGScheduler.org > $apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages( > DAGScheduler.scala:1499) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1487) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > abortStage$1.apply(DAGScheduler.scala:1486) > > at scala.collection.mutable.ResizableArray$class.foreach( > ResizableArray.scala:59) > > at scala.collection.mutable.ArrayBuffer.foreach( > ArrayBuffer.scala:48) > > at org.apache.spark.scheduler.DAGScheduler.abortStage( > DAGScheduler.scala:1486) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > at org.apache.spark.scheduler.DAGScheduler$$anonfun$ > handleTaskSetFailed$1.apply(DAGScheduler.scala:814) > > at scala.Option.foreach(Option.scala:257) > > at org.apache.spark.scheduler.DAGScheduler. > handleTaskSetFailed(DAGScheduler.scala:814) > > at org.apache.spark.scheduler. > DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714) > > at org.apache.spark.scheduler. 
> DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669) > > at org.apache.spark.scheduler. > DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658) > > at org.apache.spark.util.EventLoop$$anon$1.run( > EventLoop.scala:48) > > at org.apache.spark.scheduler.DAGScheduler.runJob( > DAGScheduler.scala:630) > > at org.apache.spark.SparkContext. > runJob(SparkContext.scala:2022) > > at org.apache.spark.SparkContext. > runJob(SparkContext.scala:2043) > > at org.apache.spark.SparkContext. > runJob(SparkContext.scala:2075) > > at org.apache.spark.internal.io. > SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:88) > > ... 21 more > > Caused by: java.lang.IllegalArgumentException: Class is not registered: > org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage > > Note: To register this class use: kryo.register(org.apache. > spark.internal.io.FileCommitProtocol$TaskCommitMessage.class); > > at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo. > java:488) > > at com.esotericsoftware.kryo.util.DefaultClassResolver. > writeClass(DefaultClassResolver.java:97) > > at com.esotericsoftware.kryo. > Kryo.writeClass(Kryo.java:517) > > at com.esotericsoftware.kryo. > Kryo.writeClassAndObject(Kryo.java:622) > > at org.apache.spark.serializer.KryoSerializerInstance. > serialize(KryoSerializer.scala:315) > > at org.apache.spark.executor.Executor$TaskRunner.run( > Executor.scala:383) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > 17/11/16 16:14:04 INFO spark.SparkContext: Invoking stop() from shutdown > hook > > 17/11/16 16:14:04 INFO server.AbstractConnector: Stopped Spark@60bdda65 > {HTTP/1.1,[http/1.1]}{0.0.0.0:4040} > > 17/11/16 16:14:04 INFO ui.SparkUI: Stopped Spark web UI at > http://192.168.1.135:4040 > > 17/11/16 16:14:04 INFO spark.MapOutputTrackerMasterEndpoint: > MapOutputTrackerMasterEndpoint stopped! > > 17/11/16 16:14:04 INFO memory.MemoryStore: MemoryStore cleared > > 17/11/16 16:14:04 INFO storage.BlockManager: BlockManager stopped > > 17/11/16 16:14:04 INFO storage.BlockManagerMaster: BlockManagerMaster > stopped > > 17/11/16 16:14:04 INFO scheduler.OutputCommitCoordinator$ > OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
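
(A side note on the "Note: To register this class use: kryo.register(...)" hint in the trace above: the same registration can be expressed as a Spark property rather than a code change. A rough, untested sketch only -- spark.kryo.classesToRegister is a standard Spark setting, but whether the Kylin cubing job honours it in addition to its own Kryo registrations, and whether the kylin.engine.spark-conf.* pass-through prefix applies to it, are assumptions on my side:

    # hypothetical workaround, not verified: ask Spark's Kryo to register the class
    # (the kylin.engine.spark-conf.* prefix is passed to spark-submit as --conf)
    echo 'kylin.engine.spark-conf.spark.kryo.classesToRegister=org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage' >> /usr/local/kylin/conf/kylin.properties

Switching to the Spark version Kylin expects is still the cleaner path; the conf line is only an experiment.)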
> > 17/11/16 16:14:04 INFO spark.SparkContext: Successfully stopped > SparkContext > > 17/11/16 16:14:04 INFO util.ShutdownHookManager: Shutdown hook called > > 17/11/16 16:14:04 INFO util.ShutdownHookManager: Deleting directory > /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6 > > The command is: > > export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && > /usr/local/kylin/spark/bin/spark-submit --class > org.apache.kylin.common.util.SparkEntry --conf > spark.executor.instances=1 --conf spark.yarn.archive=hdfs:// > trinitybdhdfs/kylin/spark/spark-libs.jar --conf > spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227 > --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history --conf > spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227 --conf > spark.master=local[*] --conf spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227 > --conf spark.hadoop.yarn.timeline-service.enabled=false --conf > spark.executor.memory=1G --conf spark.eventLog.enabled=true --conf > spark.eventLog.dir=hdfs:///kylin/spark-history --conf > spark.executor.cores=2 --jars /usr/hdp/2.4.3.0-227/hbase/ > lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.3. > 0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0- > 227/hbase/lib/guava-12.0.1.jar, /usr/local/kylin/lib/kylin-job-2.2.0.jar > -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable > default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb > -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2- > 68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId > d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs > ,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb > -cubename test_sample_cube > > at org.apache.kylin.common.util. > CliCommandExecutor.execute(CliCommandExecutor.java:92) > > at org.apache.kylin.engine.spark.SparkExecutable.doWork( > SparkExecutable.java:152) > > at org.apache.kylin.job.execution.AbstractExecutable. > execute(AbstractExecutable.java:125) > > at org.apache.kylin.job.execution. > DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:64) > > at org.apache.kylin.job.execution.AbstractExecutable. 
> execute(AbstractExecutable.java:125) > > at org.apache.kylin.job.impl.threadpool.DefaultScheduler$ > JobRunner.run(DefaultScheduler.java:144) > > at java.util.concurrent.ThreadPoolExecutor.runWorker( > ThreadPoolExecutor.java:1142) > > at java.util.concurrent.ThreadPoolExecutor$Worker.run( > ThreadPoolExecutor.java:617) > > at java.lang.Thread.run(Thread.java:745) > > 2017-11-16 16:14:05,169 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 > : job id:26342fa2-68ac-48e4-9eea-814206fb79e3-06 from RUNNING to ERROR > > 2017-11-16 16:14:05,217 INFO [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 > : job id:26342fa2-68ac-48e4-9eea-814206fb79e3 from RUNNING to ERROR > > 2017-11-16 16:14:05,217 DEBUG [Scheduler 1211098754 Job > 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:259 > : no need to send email, user list is empty > > 2017-11-16 16:14:05,226 INFO [pool-8-thread-1] > threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 0 actual > running, 0 stopped, 0 ready, 3 already succeed, 1 error, 1 discarded, 0 > others > > 2017-11-16 16:14:16,344 INFO [pool-8-thread-1] > threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 0 actual > running, 0 stopped, 0 ready, 3 already succeed, 1 error, 1 discarded, 0 > others
>

--
Best regards,

Shaofeng Shi 史少锋
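
For anyone who lands on this thread with the same TaskCommitMessage trace: the quickest check is which Spark build the cubing step is actually submitting with, and if it reports 2.2.x, pointing SPARK_HOME at the Spark bundled with the Kylin 2.2 binary package is the suggested way out. A rough sketch only, assuming the /usr/local/kylin layout shown in the logs above and the standard kylin.sh start/stop scripts:

    # which Spark does the cube build use?
    /usr/local/kylin/spark/bin/spark-submit --version

    # if it is a 2.2.x build, switch SPARK_HOME to the bundled Spark and restart Kylin
    export KYLIN_HOME=/usr/local/kylin
    export SPARK_HOME=$KYLIN_HOME/spark
    $KYLIN_HOME/bin/kylin.sh stop
    $KYLIN_HOME/bin/kylin.sh start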
Are you running Spark 2.2? Kylin 2.2 supports Spark 2.1.1;= Please use the embedded=C2=A0Spark by setting SPARK_HOME to KYLIN_HOME/spa= rk.=C2=A0

20= 17-11-16 19:33 GMT+08:00 Prasanna <prasanna.p@trinitymobility= .com>:

Hi all,

=C2=A0

I installed 2.2.0 by following h= ttp://kylin.apache.org/docs21/tutorial/cube_spark.html .Kylin service is started successfully. I tried to build kylin cube on spark engine, but its failed at 7th st= ep build cube with spark engine. Please suggest me how to solve this problem.Therse are my logs. Please suggest me its high priority for me.<= /u>

=C2=A0

=C2=A0

2017-11-16 16:13:46,345 INFO=C2=A0 [pool-8-thread-1] threadpool.DefaultScheduler:113 : CubingJob{id=3D26342fa2-68ac-48e4-9eea-814206fb79e3, name=3DBUILD CUBE= - test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27, state=3DREADY} prepare to schedule

2017-11-16 16:13:46,346 INFO=C2=A0 [pool-8-thread-1] threadpool.DefaultScheduler:116 : CubingJob{id=3D26342fa2-68ac-48e4-9eea-814206fb79e3, name=3DBUILD CUBE= - test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27, state=3DREADY} scheduled

2017-11-16 16:13:46,346 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable= :111 : Executing AbstractExecutable (BUILD CUBE - test_sample_cube - 20160101120000_20171114140000 - GMT+08:00 2017-11-15 21:02:27)

2017-11-16 16:13:46,349 INFO=C2=A0 [pool-8-thread-1] threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 1 act= ual running, 0 stopped, 1 ready, 3 already succeed, 0 error, 1 discarded, 0 oth= ers

2017-11-16 16:13:46,360 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:= 421 : job id:26342fa2-68ac-48e4-9eea-814206fb79e3 from READY to RUNNING

2017-11-16 16:13:46,373 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable= :111 : Executing AbstractExecutable (Build Cube with Spark)

2017-11-16 16:13:46,385 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:= 421 : job id:26342fa2-68ac-48e4-9eea-814206fb79e3-06 from READY to RUNNING

2017-11-16 16:13:46,399 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:76 : SPARK_HOME was set to /usr/local/kylin/spark

2017-11-16 16:13:46,399 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:120 : = Using /usr/local/kylin/hadoop-conf as HADOOP_CONF_DIR

2017-11-16 16:13:46,900 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:162 := Kylin Config was updated with kylin.metadata.url : /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta548374980908023= 1586/meta

2017-11-16 16:13:46,901 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79= : Using metadata url /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta548374980908023= 1586/meta for resource store

2017-11-16 16:13:47,038 WARN=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] util.HeapMemorySizeUtil:55 : hbase.regionserver.global.memstore.upperLimit is deprecated by hbase.regionserver.global.memstore.size

2017-11-16 16:13:47,103 DEBUG [Scheduler 1211098754 = Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.JobRelatedMetaUtil:70= : Dump resources to /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta548374980908023= 1586/meta took 203 ms

2017-11-16 16:13:47,105 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:162 := Kylin Config was updated with kylin.metadata.url : /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta548374980908023= 1586/meta

2017-11-16 16:13:47,105 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79= : Using metadata url /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta548374980908023= 1586/meta for resource store

2017-11-16 16:13:47,105 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceStore:79= : Using metadata url kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_metad= ata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:13:47,155 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:76 : = hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:13:47,157 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] persistence.ResourceTool:167= : Copy from /usr/local/kylin/bin/../tomcat/temp/kylin_job_meta5483749809= 080231586/meta to org.apache.kylin.storage.hdfs.HDFSResourceStore@2f8908ea<= /u>

2017-11-16 16:13:47,157 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 := res path : /cube/test_sample_cube.json

2017-11-16 16:13:47,157 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 := put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4c= cd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube/test_sample_cube.json

2017-11-16 16:13:47,197 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 := res path : /cube_desc/test_sample_cube.json

2017-11-16 16:13:47,197 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 := put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb/cube_desc/test_sample_cube.json<= /u>

2017-11-16 16:13:47,320 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 := res path : /cube_statistics/test_sample_cube/d4ccd867-e0ae-4ec2-b2ff-fc5f1= cc00dbb.seq

2017-11-16 16:13:47,320 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 := put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb/cube_statistics/test_sample_cube/d4ccd867-= e0ae-4ec2-b2ff-fc5f1cc00dbb.seq

2017-11-16 16:13:48,998 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 := res path : /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8= ed-e18a5a4eba5a.dict

2017-11-16 16:13:48,998 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 := put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE= /6dc5b112-399a-43cd-a8ed-e18a5a4eba5a.dict

2017-11-16 16:13:49,031 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict
2017-11-16 16:13:49,031 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict
2017-11-16 16:13:49,064 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dict
2017-11-16 16:13:49,064 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dict
2017-11-16 16:13:49,097 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict
2017-11-16 16:13:49,098 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict
2017-11-16 16:13:49,131 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131dbe2d.dict
2017-11-16 16:13:49,131 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131dbe2d.dict
2017-11-16 16:13:49,164 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208.dict
2017-11-16 16:13:49,164 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208.dict
2017-11-16 16:13:49,197 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict
2017-11-16 16:13:49,198 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict
2017-11-16 16:13:49,231 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92ad38ef16d0.dict
2017-11-16 16:13:49,231 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92ad38ef16d0.dict
2017-11-16 16:13:49,306 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict
2017-11-16 16:13:49,306 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict

2017-11-16 16:13:49,339 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict
2017-11-16 16:13:49,339 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict
2017-11-16 16:13:49,372 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict
2017-11-16 16:13:49,373 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict
2017-11-16 16:13:49,406 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict
2017-11-16 16:13:49,406 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict
2017-11-16 16:13:49,439 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict
2017-11-16 16:13:49,439 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict
2017-11-16 16:13:49,472 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-6e7511b0b57a.dict
2017-11-16 16:13:49,473 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-6e7511b0b57a.dict
2017-11-16 16:13:49,506 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict
2017-11-16 16:13:49,506 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict
2017-11-16 16:13:49,539 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict
2017-11-16 16:13:49,539 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict
2017-11-16 16:13:49,572 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict
2017-11-16 16:13:49,573 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict
2017-11-16 16:13:49,606 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict
2017-11-16 16:13:49,606 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict
2017-11-16 16:13:49,639 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict
2017-11-16 16:13:49,639 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict
2017-11-16 16:13:49,672 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict
2017-11-16 16:13:49,673 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict
2017-11-16 16:13:49,706 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict
2017-11-16 16:13:49,706 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict
2017-11-16 16:13:49,739 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict
2017-11-16 16:13:49,739 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict
2017-11-16 16:13:49,772 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict
2017-11-16 16:13:49,773 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict
2017-11-16 16:13:49,806 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict
2017-11-16 16:13:49,806 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict
2017-11-16 16:13:49,839 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict
2017-11-16 16:13:49,839 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict
2017-11-16 16:13:49,872 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952-a7a5-c0119e3da826.dict
2017-11-16 16:13:49,873 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952-a7a5-c0119e3da826.dict
2017-11-16 16:13:49,906 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict
2017-11-16 16:13:49,906 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict
2017-11-16 16:13:49,939 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict
2017-11-16 16:13:49,939 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict

2017-11-16 16:13:49,972 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /kylin.properties
2017-11-16 16:13:49,973 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/kylin.properties
2017-11-16 16:13:50,006 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /model_desc/test_sample_model.json
2017-11-16 16:13:50,006 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/model_desc/test_sample_model.json
2017-11-16 16:13:50,039 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /project/test_sample.json
2017-11-16 16:13:50,039 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/project/test_sample.json
2017-11-16 16:13:50,072 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /table/TRINITYICCC.INDEX_EVENT.json
2017-11-16 16:13:50,073 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/table/TRINITYICCC.INDEX_EVENT.json
2017-11-16 16:13:50,139 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /table/TRINITYICCC.PINMAPPING_FACT.json
2017-11-16 16:13:50,139 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/table/TRINITYICCC.PINMAPPING_FACT.json
2017-11-16 16:13:50,181 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:170 : res path : /table/TRINITYICCC.V_ANALYST_INCIDENTS.json
2017-11-16 16:13:50,181 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] hdfs.HDFSResourceStore:172 : put resource : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/table/TRINITYICCC.V_ANALYST_INCIDENTS.json

2017-11-16 16:13:50,214 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] common.KylinConfigBase:76 : SPARK_HOME was set to /usr/local/kylin/spark
2017-11-16 16:13:50,215 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:149 : cmd: export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && /usr/local/kylin/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.archive=hdfs://trinitybdhdfs/kylin/spark/spark-libs.jar  --conf spark.yarn.queue=default  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf spark.master=local[*]  --conf spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf spark.hadoop.yarn.timeline-service.enabled=false  --conf spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2 --jars /usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar, /usr/local/kylin/lib/kylin-job-2.2.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube
2017-11-16 16:13:51,639 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube
2017-11-16 16:13:51,649 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Abstract Application args:-hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube

2017-11-16 16:13:51,725 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:51 INFO spark.SparkContext: Running Spark version 2.2.0
2017-11-16 16:13:52,221 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkContext: Submitted application: Cubing for:test_sample_cube segment d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
2017-11-16 16:13:52,247 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls to: hdfs
2017-11-16 16:13:52,248 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls to: hdfs
2017-11-16 16:13:52,248 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls groups to:
2017-11-16 16:13:52,249 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls groups to:
2017-11-16 16:13:52,249 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hdfs); groups with view permissions: Set(); users  with modify permissions: Set(hdfs); groups with modify permissions: Set()
2017-11-16 16:13:52,572 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO util.Utils: Successfully started service 'sparkDriver' on port 42799.
2017-11-16 16:13:52,593 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkEnv: Registering MapOutputTracker
2017-11-16 16:13:52,613 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkEnv: Registering BlockManagerMaster
2017-11-16 16:13:52,616 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2017-11-16 16:13:52,617 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2017-11-16 16:13:52,634 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-b8d6ec0d-8a73-4ce6-9dbf-64002d5e2a62
2017-11-16 16:13:52,655 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
2017-11-16 16:13:52,712 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2017-11-16 16:13:52,790 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO util.log: Logging initialized @2149ms
2017-11-16 16:13:52,859 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO server.Server: jetty-9.3.z-SNAPSHOT
2017-11-16 16:13:52,875 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO server.Server: Started @2235ms
2017-11-16 16:13:52,896 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO server.AbstractConnector: Started ServerConnector@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2017-11-16 16:13:52,897 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.

2017-11-16 16:13:52,923 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f5c79a6{/jobs,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,923 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1fc793c2{/jobs/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,924 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@329a1243{/jobs/job,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,925 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@27f9e982{/jobs/job/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,926 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@37d3d232{/stages,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,926 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@581d969c{/stages/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,927 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b46a8c1{/stages/stage,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,928 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5851bd4f{/stages/stage/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,929 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f40a43{/stages/pool,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,930 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69c43e48{/stages/pool/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,930 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a80515c{/storage,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,931 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c807b1d{/storage/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,932 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b39fd82{/storage/rdd,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,933 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@21680803{/storage/rdd/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,933 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c8b96ec{/environment,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,934 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2d8f2f3a{/environment/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,935 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7048f722{/executors,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,936 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58a55449{/executors/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,936 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e0ff644{/executors/threadDump,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,937 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2a2bb0eb{/executors/threadDump/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,945 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2d0566ba{/static,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,946 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@29d2d081{/,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,948 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58783f6c{/api,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,948 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@88d6f9b{/jobs/job/kill,null,AVAILABLE,@Spark}
2017-11-16 16:13:52,949 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@475b7792{/stages/stage/kill,null,AVAILABLE,@Spark}

2017-11-16 16:13:52,952 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.135:4040
2017-11-16 16:13:52,977 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar at spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with timestamp 1510829032976
2017-11-16 16:13:52,977 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar at spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp 1510829032977
2017-11-16 16:13:52,977 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar at spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977
2017-11-16 16:13:52,978 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/local/kylin/lib/kylin-job-2.2.0.jar at spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978
2017-11-16 16:13:53,042 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO executor.Executor: Starting executor ID driver on host localhost
2017-11-16 16:13:53,061 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33164.
2017-11-16 16:13:53,066 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO netty.NettyBlockTransferService: Server created on 192.168.1.135:33164
2017-11-16 16:13:53,068 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2017-11-16 16:13:53,070 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.135, 33164, None)
2017-11-16 16:13:53,073 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.1.135:33164 with 366.3 MB RAM, BlockManagerId(driver, 192.168.1.135, 33164, None)
2017-11-16 16:13:53,076 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.135, 33164, None)
2017-11-16 16:13:53,076 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.135, 33164, None)
2017-11-16 16:13:53,225 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b9c1b51{/metrics/json,null,AVAILABLE,@Spark}

2017-11-16 16:13:54,057 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO scheduler.EventLoggingListener: Logging events to hdfs:///kylin/spark-history/local-1510829033012
2017-11-16 16:13:54,093 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO common.AbstractHadoopJob: Ready to load KylinConfig from uri: kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
2017-11-16 16:13:54,250 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeManager: Initializing CubeManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
2017-11-16 16:13:54,254 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO persistence.ResourceStore: Using metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store
2017-11-16 16:13:54,282 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO hdfs.HDFSResourceStore: hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
2017-11-16 16:13:54,295 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeManager: Loading Cube from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube
2017-11-16 16:13:54,640 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeDescManager: Initializing CubeDescManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
2017-11-16 16:13:54,641 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeDescManager: Reloading Cube Metadata from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc
2017-11-16 16:13:54,705 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO project.ProjectManager: Initializing ProjectManager with metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
2017-11-16 16:13:54,761 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: Checking custom measure types from kylin config
2017-11-16 16:13:54,762 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering COUNT_DISTINCT(hllc), class org.apache.kylin.measure.hllc.HLLCMeasureType$Factory
2017-11-16 16:13:54,768 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering COUNT_DISTINCT(bitmap), class org.apache.kylin.measure.bitmap.BitmapMeasureType$Factory
2017-11-16 16:13:54,774 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering TOP_N(topn), class org.apache.kylin.measure.topn.TopNMeasureType$Factory
2017-11-16 16:13:54,776 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering RAW(raw), class org.apache.kylin.measure.raw.RawMeasureType$Factory
2017-11-16 16:13:54,778 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering EXTENDED_COLUMN(extendedcolumn), class org.apache.kylin.measure.extendedcolumn.ExtendedColumnMeasureType$Factory
2017-11-16 16:13:54,780 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering PERCENTILE(percentile), class org.apache.kylin.measure.percentile.PercentileMeasureType$Factory
2017-11-16 16:13:54,800 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO metadata.MetadataManager: Reloading data model at /model_desc/test_sample_model.json
2017-11-16 16:13:54,931 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeDescManager: Loaded 1 Cube(s)
2017-11-16 16:13:54,932 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeManager: Reloaded cube test_sample_cube being CUBE[name=test_sample_cube] having 1 segments
2017-11-16 16:13:54,932 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes
2017-11-16 16:13:54,942 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:54 INFO spark.SparkCubingByLayer: RDD Output path: hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
2017-11-16 16:13:55,758 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO spark.SparkCubingByLayer: All measure are normal (agg on all cuboids) ? : true
2017-11-16 16:13:55,868 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO internal.SharedState: loading hive config file: file:/usr/local/spark/conf/hive-site.xml
2017-11-16 16:13:55,888 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO internal.SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/apps/hive/warehouse').
2017-11-16 16:13:55,889 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO internal.SharedState: Warehouse path is '/apps/hive/warehouse'.

2017-11-16 16:13:55,895 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@75cf0de5{/SQL,null,AVAILABLE,@Spark}
2017-11-16 16:13:55,895 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@468173fa{/SQL/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:55,896 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@27e2287c{/SQL/execution,null,AVAILABLE,@Spark}
2017-11-16 16:13:55,896 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2cd388f5{/SQL/execution/json,null,AVAILABLE,@Spark}
2017-11-16 16:13:55,898 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4207852d{/static/sql,null,AVAILABLE,@Spark}
2017-11-16 16:13:56,397 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:56 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2017-11-16 16:13:57,548 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
2017-11-16 16:13:57,583 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.
2017-11-16 16:13:57,709 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO session.SessionState: Created local directory: /tmp/58660d8b-48ac-4cf0-bd06-6b96018a5482_resources
2017-11-16 16:13:57,732 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482
2017-11-16 16:13:57,734 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO session.SessionState: Created local directory: /tmp/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482
2017-11-16 16:13:57,738 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482/_tmp_space.db

2017-11-16 16:13:57,740 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /apps/hive/warehouse
2017-11-16 16:13:57,751 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=58660d8b-48ac-4cf0-bd06-6b96018a5482, clientType=HIVECLI]
2017-11-16 16:13:57,752 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2017-11-16 16:13:57,756 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
2017-11-16 16:13:57,757 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.
2017-11-16 16:13:58,073 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
2017-11-16 16:13:58,073 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
2017-11-16 16:13:58,075 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
2017-11-16 16:13:58,078 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO session.SessionState: Created local directory: /tmp/bd69eb21-01c1-4dd3-b31c-16e065ab4101_resources
2017-11-16 16:13:58,088 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101
2017-11-16 16:13:58,089 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO session.SessionState: Created local directory: /tmp/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101
2017-11-16 16:13:58,096 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101/_tmp_space.db
2017-11-16 16:13:58,097 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /apps/hive/warehouse
2017-11-16 16:13:58,098 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=bd69eb21-01c1-4dd3-b31c-16e065ab4101, clientType=HIVECLI]
2017-11-16 16:13:58,098 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2017-11-16 16:13:58,098 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
2017-11-16 16:13:58,100 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
2017-11-16 16:13:58,139 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
2017-11-16 16:13:58,143 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO execution.SparkSqlParser: Parsing command: default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb

2017-11-16 16:13:58,292 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
2017-11-16 16:13:58,294 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
2017-11-16 16:13:58,345 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,355 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,355 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,356 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: timestamp
2017-11-16 16:13:58,357 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,357 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,358 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,358 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double
2017-11-16 16:13:58,358 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: double
2017-11-16 16:13:58,359 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,359 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,359 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,360 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,360 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,360 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,361 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,361 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,361 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,362 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,362 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: string
2017-11-16 16:13:58,362 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: boolean
2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,363 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int
2017-11-16 16:13:58,364 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:13:58 INFO parser.CatalystSqlParser: Parsing command: int

2017-11-16 16:14:00,368 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 373.5 KB, free 365.9 MB)
2017-11-16 16:14:00,685 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 35.8 KB, free 365.9 MB)
2017-11-16 16:14:00,688 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:00 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.135:33164 (size: 35.8 KB, free: 366.3 MB)
2017-11-16 16:14:00,691 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:00 INFO spark.SparkContext: Created broadcast 0 from
2017-11-16 16:14:01,094 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict
2017-11-16 16:14:01,106 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict
2017-11-16 16:14:01,111 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict
2017-11-16 16:14:01,115 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict
2017-11-16 16:14:01,119 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict
2017-11-16 16:14:01,122 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict
2017-11-16 16:14:01,127 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict
2017-11-16 16:14:01,131 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict
2017-11-16 16:14:01,135 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict
2017-11-16 16:14:01,139 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict
2017-11-16 16:14:01,142 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict
2017-11-16 16:14:01,146 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict
2017-11-16 16:14:01,149 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict

2017-11-16 16:14:01,152 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac715267= 2.dict

2017-11-16 16:14:01,156 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44= d4-a85e-223ee929068d.dict

2017-11-16 16:14:01,159 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677= f-29b9-4952-a7a5-c0119e3da826.dict

2017-11-16 16:14:01,164 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict<= u>

2017-11-16 16:14:01,167 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092= -4d4c-83df-6e7511b0b57a.dict

2017-11-16 16:14:01,171 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55= -412e-b7d6-a2ff093aaf56.dict

2017-11-16 16:14:01,174 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict<= /p>

2017-11-16 16:14:01,178 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8= ed-e18a5a4eba5a.dict

2017-11-16 16:14:01,181 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4= 912-8570-111fe71bd39d.dict

2017-11-16 16:14:01,184 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict

2017-11-16 16:14:01,188 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading = DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-4= 9c6-b640-92ad38ef16d0.dict

2017-11-16 16:14:01,191 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c40247= 26-85bb-484c-b5ca-4f1c2fb4dec0.dict

2017-11-16 16:14:01,195 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55= ee3ecbf.dict

2017-11-16 16:14:01,199 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9= a43099c1208.dict

2017-11-16 16:14:01,202 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b= 72e-696c131dbe2d.dict

2017-11-16 16:14:01,261 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO common.CubeStatsReader: Estimating size for layer 0, all cuboids are 536870911, total size is 0.010198831558227539

2017-11-16 16:14:01,261 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO spark.SparkCubingByLayer: Partition for spark cubing: 1

2017-11-16 16:14:01,353 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1

2017-11-16 16:14:01,422 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO spark.SparkContext: Starting job: runJob at SparkHadoopMapReduceWriter.scala:88

2017-11-16 16:14:01,543 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO mapred.FileInputFormat: Total input paths to process : 1

2017-11-16 16:14:01,623 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Registering RDD 6 (mapToPair at SparkCubingByLayer.java:170)

2017-11-16 16:14:01,627 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopMapReduceWriter.scala:88) with 1 output partitions

2017-11-16 16:14:01,628 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopMapReduceWriter.scala:88)

2017-11-16 16:14:01,629 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)

2017-11-16 16:14:01,638 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)

2017-11-16 16:14:01,652 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170), which has no missing parents

2017-11-16 16:14:01,855 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 25.8 KB, free 365.9 MB)

2017-11-16 16:14:01,892 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 10.7 KB, free 365.9 MB)

2017-11-16 16:14:01,894 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.1.135:33164 (size: 10.7 KB, free: 366.3 MB)

2017-11-16 16:14:01,896 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006

2017-11-16 16:14:01,922 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170) (first 15 tasks are for partitions Vector(0))

2017-11-16 16:14:01,924 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
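
The DAGScheduler lines above explain the two-stage shape of this job: the mapToPair at SparkCubingByLayer.java:170 sets up a shuffle, which becomes ShuffleMapStage 0, and the write of the cuboid output becomes ResultStage 1 (the runJob at SparkHadoopMapReduceWriter.scala:88). Just as a rough illustration of that shape (a toy Java job, not Kylin's actual cubing code; the input values and output path below are made up):

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class TwoStageShape {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("two-stage-shape").setMaster("local[1]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // stand-in for the intermediate flat-table rows read from HDFS
        JavaRDD<String> rows = sc.parallelize(Arrays.asList("a", "b", "a"));

        // mapToPair introduces the shuffle dependency -> this is what shows up as ShuffleMapStage 0
        JavaPairRDD<String, Integer> keyed = rows.mapToPair(r -> new Tuple2<>(r, 1));

        // the aggregation plus the output action form the final ResultStage that writes the result
        keyed.reduceByKey(Integer::sum).saveAsTextFile("/tmp/two-stage-shape");

        sc.stop();
    }
}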

2017-11-16 16:14:02,015 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID= 0, localhost, executor driver, partition 0, ANY, 4978 bytes)

2017-11-16 16:14:02,033 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)

2017-11-16 16:14:02,044 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar = with timestamp 1510829032977

2017-11-16 16:14:02,159 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO client.TransportClientFactory: Successfully created connectio= n to /192.168.1.135:427= 99 after 64 ms (0 ms spent in bootstraps)

2017-11-16 16:14:02,179 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar = to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7= 729a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp5518529699147501519.tmp

2017-11-16 16:14:02,259 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0= f7729a-5561-48d1-bfd5-e3459b0dc20e/metrics-core-2.2.0.jar to class loader

2017-11-16 16:14:02,260 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timesta= mp 1510829032977

2017-11-16 16:14:02,261 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/guava-12.0.1.jar to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729= a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp23683684527060930= 62.tmp

2017-11-16 16:14:02,278 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0= f7729a-5561-48d1-bfd5-e3459b0dc20e/guava-12.0.1.jar to class loader

2017-11-16 16:14:02,278 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.1= 68.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with timestamp 1510829032976

2017-11-16 16:14:02,279 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.13= 5:42799/jars/htrace-core-3.1.0-incubating.jar to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729= a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp45393749103399581= 67.tmp

2017-11-16 16:14:02,295 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0= f7729a-5561-48d1-bfd5-e3459b0dc20e/htrace-core-3.1.0-incubating.j= ar to class loader

2017-11-16 16:14:02,295 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with t= imestamp 1510829032978

2017-11-16 16:14:02,296 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar to /tm= p/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5= 561-48d1-bfd5-e3459b0dc20e/fetchFileTemp9086394010889635270.= tmp

2017-11-16 16:14:02,418 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0= f7729a-5561-48d1-bfd5-e3459b0dc20e/kylin-job-2.2.0.jar to class loader

2017-11-16 16:14:02,540 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO rdd.HadoopRDD: Input split: hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e= 4-9eea-814206fb79e3/kylin_intermediate_test_sample_cube_d4ccd867_= e0ae_4ec2_b2ff_fc5f1cc00dbb/000000_0:0+19534

2017-11-16 16:14:02,569 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library

2017-11-16 16:14:02,570 INFO =C2=A0[Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]<= /u>

2017-11-16 16:14:02,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]<= /u>

2017-11-16 16:14:02,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]<= /u>

2017-11-16 16:14:02,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]<= /u>

2017-11-16 16:14:03,035 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 251.01178 ms<= u>

2017-11-16 16:14:03,106 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 55.530064 ms<= u>

2017-11-16 16:14:03,148 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO common.AbstractHadoopJob: Ready to load KylinConfig from uri: kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_metad= ata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,170 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeManager: Initializing CubeManager with config kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_metad= ata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,170 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO persistence.ResourceStore: Using metadata url kylin_metadata@= hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4= ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:14:03,190 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO hdfs.HDFSResourceStore: hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,194 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeManager: Loading Cube from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb/cube

2017-11-16 16:14:03,198 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeDescManager: Initializing CubeDescManager with confi= g kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_metad= ata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,198 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeDescManager: Reloading Cube Metadata from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-= 4ec2-b2ff-fc5f1cc00dbb/cube_desc

2017-11-16 16:14:03,206 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO project.ProjectManager: Initializing ProjectManager with meta= data url kylin_metadata@hdfs,path=3Dhdfs://trinitybdhdfs/kylin/kylin_m= etadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:03,213 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN cachesync.Broadcaster: More than one singleton exist

2017-11-16 16:14:03,213 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN project.ProjectManager: More than one singleton exist<= u>

2017-11-16 16:14:03,232 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO metadata.MetadataManager: Reloading data model at /model_desc/test_sample_model.json

2017-11-16 16:14:03,237 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN metadata.MetadataManager: More than one singleton exist, curr= ent keys: 1464031233,1545268424

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeDescManager: Loaded 1 Cube(s)

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN cube.CubeDescManager: More than one singleton exist=

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeManager: Reloaded cube test_sample_cube being CUBE[name=3Dtest_sample_cube] having 1 segments

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN cube.CubeManager: More than one singleton exist=

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN cube.CubeManager: type: class org.apache.kylin.common.Ky= linConfig reference: 1464031233

2017-11-16 16:14:03,239 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN cube.CubeManager: type: class org.apache.kylin.common.Ky= linConfig reference: 1545268424

2017-11-16 16:14:03,283 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 WARN dict.DictionaryManager: More than one singleton exist<= u>

2017-11-16 16:14:03,283 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46= -48f6-8531-47afa69bea83.dict

2017-11-16 16:14:03,287 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce= 7f-4553-9622-aeaf4fe878b6.dict

2017-11-16 16:14:03,290 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading= DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7= -6190-4951-afd7-163137f3988e.dict

2017-11-16 16:14:03,294 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7e= c0-4074-9a6f-05977b6e3260.dict

2017-11-16 16:14:03,297 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-= 42e4-4685-8a7a-0e1e4ef7dcd3.dict

2017-11-16 16:14:03,300 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58= -19b3-46cb-bcf7-64d1b7b10fe0.dict

2017-11-16 16:14:03,303 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2e= c9-4d28-81dc-fdac31787942.dict

2017-11-16 16:14:03,309 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.d= ict

2017-11-16 16:14:03,315 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477= 386-68e1-4e82-8852-ab9bf2a6a114.dict

2017-11-16 16:14:03,321 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4f= c1-9943-9db12084dea3.dict

2017-11-16 16:14:03,329 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict=

2017-11-16 16:14:03,335 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933= -8229-46c0-bb82-fd0cf01bede5.dict

2017-11-16 16:14:03,340 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-= a2f9-403c-9b0b-64a7b03f8f84.dict

2017-11-16 16:14:03,345 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac715267= 2.dict

2017-11-16 16:14:03,351 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44= d4-a85e-223ee929068d.dict

2017-11-16 16:14:03,357 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677= f-29b9-4952-a7a5-c0119e3da826.dict

2017-11-16 16:14:03,363 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict<= u>

2017-11-16 16:14:03,369 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092= -4d4c-83df-6e7511b0b57a.dict

2017-11-16 16:14:03,375 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_I= NCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict<= u>

2017-11-16 16:14:03,381 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-874= 5-8af522641496.dict

2017-11-16 16:14:03,386 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8= ed-e18a5a4eba5a.dict

2017-11-16 16:14:03,391 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict=

2017-11-16 16:14:03,397 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f= 6dc6e6db62d.dict

2017-11-16 16:14:03,402 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-4= 9c6-b640-92ad38ef16d0.dict

2017-11-16 16:14:03,408 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c40247= 26-85bb-484c-b5ca-4f1c2fb4dec0.dict

2017-11-16 16:14:03,414 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55= ee3ecbf.dict

2017-11-16 16:14:03,420 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9= a43099c1208.dict

2017-11-16 16:14:03,425 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 1= 7/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b= 72e-696c131dbe2d.dict

2017-11-16 16:14:04,031 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 1347 bytes result sent to driver

2017-11-16 16:14:04,072 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2084 ms on localhost (executor driver) (1/1)

2017-11-16 16:14:04,075 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool

2017-11-16 16:14:04,082 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPair at SparkCubingByLayer.java:170) finished in 2.118 s

2017-11-16 16:14:04,082 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: looking for newly runnable stages

2017-11-16 16:14:04,083 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: running: Set()

2017-11-16 16:14:04,083 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)

2017-11-16 16:14:04,084 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: failed: Set()

2017-11-16 16:14:04,088 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238), which has no missing parents

2017-11-16 16:14:04,134 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 82.3 KB, free 365.8 MB)

2017-11-16 16:14:04,153 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 31.6 KB, free 365.8 MB)

2017-11-16 16:14:04,154 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.1.135:33164 (size: 31.6 KB, free: 366.2 MB)

2017-11-16 16:14:04,155 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006

2017-11-16 16:14:04,158 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238) (first 15 tasks are for partitions Vector(0))

2017-11-16 16:14:04,158 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks

2017-11-16 16:14:04,160 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 bytes)

2017-11-16 16:14:04,160 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)

2017-11-16 16:14:04,204 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks

2017-11-16 16:14:04,206 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 6 ms

2017-11-16 16:14:04,315 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO memory.MemoryStore: Block rdd_7_0 stored as bytes in memory (estimated size 49.2 KB, free 365.7 MB)

2017-11-16 16:14:04,315 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory on 192.168.1.135:33164 (size: 49.2 KB, free: 366.2 MB)

2017-11-16 16:14:04,331 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1

2017-11-16 16:14:04,334 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1

2017-11-16 16:14:04,359 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO common.AbstractHadoopJob: Ready to load KylinConfig from uri: kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,377 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO cube.CubeDescManager: Initializing CubeDescManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,377 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO persistence.ResourceStore: Using metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

2017-11-16 16:14:04,393 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO hdfs.HDFSResourceStore: hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,394 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO cube.CubeDescManager: Reloading Cube Metadata from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc

2017-11-16 16:14:04,400 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO project.ProjectManager: Initializing ProjectManager with metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

2017-11-16 16:14:04,406 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 WARN cachesync.Broadcaster: More than one singleton exist

2017-11-16 16:14:04,406 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 WARN project.ProjectManager: More than one singleton exist

2017-11-16 16:14:04,423 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO metadata.MetadataManager: Reloading data model at /model_desc/test_sample_model.json

2017-11-16 16:14:04,427 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 WARN metadata.MetadataManager: More than one singleton exist, current keys: 1464031233,1545268424,1474775600

2017-11-16 16:14:04,428 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO cube.CubeDescManager: Loaded 1 Cube(s)

2017-11-16 16:14:04,428 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 WARN cube.CubeDescManager: More than one singleton exist

2017-11-16 16:14:04,498 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO output.FileOutputCommitter: Saved output of task 'attempt_20171116161401_0001_r_000000_0' to hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/level_base_cuboid/_temporary/0/task_20171116161401_0001_r_000000

2017-11-16 16:14:04,499 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO mapred.SparkHadoopMapRedUtil: attempt_20171116161401_0001_r_000000_0: Committed

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 ERROR executor.Executor: Exception in task 0.0 in stage 1.0 (TID 1)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,517 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.lang.Thread.run(Thread.java:745)
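
This is Spark's standard Kryo "Class is not registered" failure, and the log itself prints the registration hint on the "Note:" line. Purely to illustrate that hint (a minimal sketch, not a change to Kylin's own cubing job, and whether registering the class by itself resolves the build depends on which Spark version the job actually runs against), the registration would look roughly like this when the SparkConf is built:

import org.apache.spark.SparkConf;

public class KryoRegistrationSketch {
    // Registers the class named in the exception with Kryo. The binary name is copied
    // verbatim from the error message; Class.forName is used because the $-separated
    // binary name cannot be written as a normal Java nested-class reference.
    public static SparkConf withTaskCommitMessageRegistered() throws ClassNotFoundException {
        return new SparkConf()
                .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .registerKryoClasses(new Class<?>[] {
                        Class.forName("org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage")
                });
    }
}

The same class name can also be supplied through configuration alone via Spark's standard spark.kryo.classesToRegister property, without touching any code.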

2017-11-16 16:14:04,543 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,544 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :

2017-11-16 16:14:04,546 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 ERROR scheduler.TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job

2017-11-16 16:14:04,547 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool

2017-11-16 16:14:04,551 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopMapReduceWriter.scala:88) failed in 0.393 s due to Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,552 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :

2017-11-16 16:14:04,553 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Driver stacktrace:

2017-11-16 16:14:04,557 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.DAGScheduler: Job 0 failed: runJob at SparkHadoopMapReduceWriter.scala:88, took 3.135125 s

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 ERROR io.SparkHadoopMapReduceWriter: Aborting job job_20171116161401_0008.

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,559 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :         at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 :

2017-11-16 16:14:04,560 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : Driver stacktrace:

2017-11-16 16:14:04,560 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGSchedule= r.scala:1499)

2017-11-16 16:14:04,560 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.app= ly(DAGScheduler.scala:1487)

2017-11-16 16:14:04,560 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.app= ly(DAGScheduler.scala:1486)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at scala.collection.mutable.ResizableArray$class.foreach(ResizableAr= ray.scala:59)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48= )

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.s= cala:1486)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFai= led$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFai= led$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at scala.Option.foreach(Option.scala:257)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(= DAGScheduler.scala:814)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnRecei= ve(DAGScheduler.scala:1714)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive= (DAGScheduler.scala:1669)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive= (DAGScheduler.scala:1658)

2017-11-16 16:14:04,561 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala= :630)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)<= /u>

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)<= /u>

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)<= /u>

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.internal.i= o.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduce= Writer.scala:88)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoop= Dataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)=

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoop= Dataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoop= Dataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationSco= pe.scala:151)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationSco= pe.scala:112)

2017-11-16 16:14:04,562 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(<= wbr>PairRDDFunctions.scala:1084)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(<= wbr>JavaPairRDD.scala:831)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(Spark= CubingByLayer.java:238)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCub= ingByLayer.java:192)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.common.util.AbstractApplication.execute(Abstract= Application.java:37)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)<= u>

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAcce= ssorImpl.java:62)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMe= thodAccessorImpl.java:43)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.lang.reflect.Method.invoke(Method.java:498)

2017-11-16 16:14:04,563 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$Spar= kSubmit$$runMain(SparkSubmit.scala:755)

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scal= a:180)

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205= )

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)<= u>

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : C= aused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.i= o.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : N= ote: To register this class use: kryo.register(org.apache.spark.in= ternal.io.FileCommitProtocol$TaskCommitMessage.class);=

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)<= /u>

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)=

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)=

2017-11-16 16:14:04,564 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.serializer.KryoSerializerInstance.serialize(Kryo= Serializer.scala:315)

2017-11-16 16:14:04,565 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:= 383)

2017-11-16 16:14:04,565 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecu= tor.java:1142)

2017-11-16 16:14:04,565 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExec= utor.java:617)

2017-11-16 16:14:04,565 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,570 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : E= xception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer

2017-11-16 16:14:04,570 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.common.util.AbstractApplication.execute(Abstract= Application.java:42)

2017-11-16 16:14:04,570 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)<= u>

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAcce= ssorImpl.java:62)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMe= thodAccessorImpl.java:43)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.lang.reflect.Method.invoke(Method.java:498)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$Spar= kSubmit$$runMain(SparkSubmit.scala:755)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scal= a:180)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205= )

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)<= u>

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : C= aused by: org.apache.spark.SparkException: Job aborted.

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.internal.i= o.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduce= Writer.scala:107)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoop= Dataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)=

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoop= Dataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoop= Dataset$1.apply(PairRDDFunctions.scala:1085)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationSco= pe.scala:151)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationSco= pe.scala:112)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

2017-11-16 16:14:04,571 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(<= wbr>PairRDDFunctions.scala:1084)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(<= wbr>JavaPairRDD.scala:831)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(Spark= CubingByLayer.java:238)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCub= ingByLayer.java:192)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.kylin.common.util.AbstractApplication.execute(Abstract= Application.java:37)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 ... 10 more

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : C= aused by: org.apache.spark.SparkException: Job aborted due to stage failure: Tas= k 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (= TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Cl= ass is not registered: org.apache.spark.internal.i= o.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : N= ote: To register this class use: kryo.register(org.apache.spark.in= ternal.io.FileCommitProtocol$TaskCommitMessage.class);=

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)<= /u>

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)=

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)=

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.serializer.KryoSerializerInstance.serialize(Kryo= Serializer.scala:315)

2017-11-16 16:14:04,572 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:= 383)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecu= tor.java:1142)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExec= utor.java:617)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : <= u>

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : D= river stacktrace:

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGSchedule= r.scala:1499)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.app= ly(DAGScheduler.scala:1487)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.app= ly(DAGScheduler.scala:1486)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at scala.collection.mutable.ResizableArray$class.foreach(ResizableAr= ray.scala:59)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48= )

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.s= cala:1486)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFai= led$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFai= led$1.apply(DAGScheduler.scala:814)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at scala.Option.foreach(Option.scala:257)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(= DAGScheduler.scala:814)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnRecei= ve(DAGScheduler.scala:1714)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive= (DAGScheduler.scala:1669)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive= (DAGScheduler.scala:1658)

2017-11-16 16:14:04,573 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala= :630)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)<= /u>

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)<= /u>

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)<= /u>

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.internal.i= o.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduce= Writer.scala:88)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 ... 21 more

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : C= aused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.i= o.FileCommitProtocol$TaskCommitMessage

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : N= ote: To register this class use: kryo.register(org.apache.spark.in= ternal.io.FileCommitProtocol$TaskCommitMessage.class);=

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)<= /u>

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)=

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)=

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.serializer.KryoSerializerInstance.serialize(Kryo= Serializer.scala:315)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:= 383)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecu= tor.java:1142)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExec= utor.java:617)

2017-11-16 16:14:04,574 INFO=C2=A0 [Scheduler 121109= 8754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : = =C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0=C2=A0 at java.lang.Thread.run(Thread.java:745)
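
For reference, the registration that the "Class is not registered" hint above is pointing at would look roughly like the sketch below. It is only the standard Spark/Kryo API applied to the class name taken from the log (the registrator class name here is made up for illustration), not a verified fix for this build failure.

    // Sketch only: register the class named in the "Class is not registered"
    // message through a custom Spark KryoRegistrator. The class name comes from
    // the log above; everything else is illustrative.
    import com.esotericsoftware.kryo.Kryo;
    import org.apache.spark.serializer.KryoRegistrator;

    public class TaskCommitMessageRegistrator implements KryoRegistrator {
        @Override
        public void registerClasses(Kryo kryo) {
            try {
                // The nested Scala class has to be looked up by its binary name.
                kryo.register(Class.forName(
                        "org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage"));
            } catch (ClassNotFoundException e) {
                throw new RuntimeException(e);
            }
        }
    }

A job would pick such a registrator up through spark.serializer=org.apache.spark.serializer.KryoSerializer and spark.kryo.registrator set to the registrator's fully qualified name in the submit configuration.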

2017-11-16 16:14:04,575 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO spark.SparkContext: Invoking stop() from shutdown hook
2017-11-16 16:14:04,579 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO server.AbstractConnector: Stopped Spark@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2017-11-16 16:14:04,581 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.1.135:4040
2017-11-16 16:14:04,636 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2017-11-16 16:14:04,643 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO memory.MemoryStore: MemoryStore cleared
2017-11-16 16:14:04,644 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO storage.BlockManager: BlockManager stopped
2017-11-16 16:14:04,649 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2017-11-16 16:14:04,651 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2017-11-16 16:14:04,653 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO spark.SparkContext: Successfully stopped SparkContext
2017-11-16 16:14:04,653 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO util.ShutdownHookManager: Shutdown hook called
2017-11-16 16:14:04,654 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:38 : 17/11/16 16:14:04 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6
2017-11-16 16:14:05,140 ERROR [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] spark.SparkExecutable:156 : error run spark job:
java.io.IOException: OS command error exit with return code: 1, error message: SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube
Abstract Application args:-hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube

17/11/16 16:13:51 INFO spark.SparkContext: Running Spark version 2.2.0
17/11/16 16:13:52 INFO spark.SparkContext: Submitted application: Cubing for:test_sample_cube segment d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls to: hdfs
17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls to: hdfs
17/11/16 16:13:52 INFO spark.SecurityManager: Changing view acls groups to:
17/11/16 16:13:52 INFO spark.SecurityManager: Changing modify acls groups to:
17/11/16 16:13:52 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hdfs); groups with view permissions: Set(); users with modify permissions: Set(hdfs); groups with modify permissions: Set()
17/11/16 16:13:52 INFO util.Utils: Successfully started service 'sparkDriver' on port 42799.
17/11/16 16:13:52 INFO spark.SparkEnv: Registering MapOutputTracker
17/11/16 16:13:52 INFO spark.SparkEnv: Registering BlockManagerMaster
17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/11/16 16:13:52 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/11/16 16:13:52 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-b8d6ec0d-8a73-4ce6-9dbf-64002d5e2a62
17/11/16 16:13:52 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
17/11/16 16:13:52 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/11/16 16:13:52 INFO util.log: Logging initialized @2149ms
17/11/16 16:13:52 INFO server.Server: jetty-9.3.z-SNAPSHOT
17/11/16 16:13:52 INFO server.Server: Started @2235ms
17/11/16 16:13:52 INFO server.AbstractConnector: Started ServerConnector@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
17/11/16 16:13:52 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f5c79a6{/jobs,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1fc793c2{/jobs/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@329a1243{/jobs/job,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@27f9e982{/jobs/job/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@37d3d232{/stages,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@581d969c{/stages/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b46a8c1{/stages/stage,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5851bd4f{/stages/stage/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f40a43{/stages/pool,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69c43e48{/stages/pool/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a80515c{/storage,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c807b1d{/storage/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b39fd82{/storage/rdd,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@21680803{/storage/rdd/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c8b96ec{/environment,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2d8f2f3a{/environment/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7048f722{/executors,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58a55449{/executors/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e0ff644{/executors/threadDump,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2a2bb0eb{/executors/threadDump/json,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2d0566ba{/static,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@29d2d081{/,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58783f6c{/api,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@88d6f9b{/jobs/job/kill,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@475b7792{/stages/stage/kill,null,AVAILABLE,@Spark}
17/11/16 16:13:52 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.135:4040
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar at spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with timestamp 1510829032976
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar at spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp 1510829032977
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar at spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977
17/11/16 16:13:52 INFO spark.SparkContext: Added JAR file:/usr/local/kylin/lib/kylin-job-2.2.0.jar at spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978
17/11/16 16:13:53 INFO executor.Executor: Starting executor ID driver on host localhost
17/11/16 16:13:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33164.
17/11/16 16:13:53 INFO netty.NettyBlockTransferService: Server created on 192.168.1.135:33164
17/11/16 16:13:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.135, 33164, None)
17/11/16 16:13:53 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.1.135:33164 with 366.3 MB RAM, BlockManagerId(driver, 192.168.1.135, 33164, None)
17/11/16 16:13:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.135, 33164, None)
17/11/16 16:13:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.135, 33164, None)
17/11/16 16:13:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b9c1b51{/metrics/json,null,AVAILABLE,@Spark}
17/11/16 16:13:54 INFO scheduler.EventLoggingListener: Logging events to hdfs:///kylin/spark-history/local-1510829033012

17/11/16 16:13:54 INFO common.AbstractHadoopJob: Ready to load KylinConfig from uri: kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
17/11/16 16:13:54 INFO cube.CubeManager: Initializing CubeManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
17/11/16 16:13:54 INFO persistence.ResourceStore: Using metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store
17/11/16 16:13:54 INFO hdfs.HDFSResourceStore: hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
17/11/16 16:13:54 INFO cube.CubeManager: Loading Cube from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube
17/11/16 16:13:54 INFO cube.CubeDescManager: Initializing CubeDescManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
17/11/16 16:13:54 INFO cube.CubeDescManager: Reloading Cube Metadata from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc
17/11/16 16:13:54 INFO project.ProjectManager: Initializing ProjectManager with metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: Checking custom measure types from kylin config
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering COUNT_DISTINCT(hllc), class org.apache.kylin.measure.hllc.HLLCMeasureType$Factory
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering COUNT_DISTINCT(bitmap), class org.apache.kylin.measure.bitmap.BitmapMeasureType$Factory
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering TOP_N(topn), class org.apache.kylin.measure.topn.TopNMeasureType$Factory
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering RAW(raw), class org.apache.kylin.measure.raw.RawMeasureType$Factory
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering EXTENDED_COLUMN(extendedcolumn), class org.apache.kylin.measure.extendedcolumn.ExtendedColumnMeasureType$Factory
17/11/16 16:13:54 INFO measure.MeasureTypeFactory: registering PERCENTILE(percentile), class org.apache.kylin.measure.percentile.PercentileMeasureType$Factory
17/11/16 16:13:54 INFO metadata.MetadataManager: Reloading data model at /model_desc/test_sample_model.json
17/11/16 16:13:54 INFO cube.CubeDescManager: Loaded 1 Cube(s)
17/11/16 16:13:54 INFO cube.CubeManager: Reloaded cube test_sample_cube being CUBE[name=test_sample_cube] having 1 segments
17/11/16 16:13:54 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes
17/11/16 16:13:54 INFO spark.SparkCubingByLayer: RDD Output path: hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/
17/11/16 16:13:55 INFO spark.SparkCubingByLayer: All measure are normal (agg on all cuboids) ? : true
17/11/16 16:13:55 INFO internal.SharedState: loading hive config file: file:/usr/local/spark/conf/hive-site.xml
17/11/16 16:13:55 INFO internal.SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/apps/hive/warehouse').
17/11/16 16:13:55 INFO internal.SharedState: Warehouse path is '/apps/hive/warehouse'.
17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@75cf0de5{/SQL,null,AVAILABLE,@Spark}
17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@468173fa{/SQL/json,null,AVAILABLE,@Spark}
17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@27e2287c{/SQL/execution,null,AVAILABLE,@Spark}
17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2cd388f5{/SQL/execution/json,null,AVAILABLE,@Spark}
17/11/16 16:13:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4207852d{/static/sql,null,AVAILABLE,@Spark}
17/11/16 16:13:56 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.
17/11/16 16:13:57 INFO session.SessionState: Created local directory: /tmp/58660d8b-48ac-4cf0-bd06-6b96018a5482_resources
17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482
17/11/16 16:13:57 INFO session.SessionState: Created local directory: /tmp/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482
17/11/16 16:13:57 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/58660d8b-48ac-4cf0-bd06-6b96018a5482/_tmp_space.db
17/11/16 16:13:57 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /apps/hive/warehouse
17/11/16 16:13:57 INFO sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=58660d8b-48ac-4cf0-bd06-6b96018a5482, clientType=HIVECLI]
17/11/16 16:13:57 INFO hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
17/11/16 16:13:57 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
17/11/16 16:13:57 INFO hive.metastore: Connected to metastore.
17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
17/11/16 16:13:58 INFO session.SessionState: Created local directory: /tmp/bd69eb21-01c1-4dd3-b31c-16e065ab4101_resources
17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101
17/11/16 16:13:58 INFO session.SessionState: Created local directory: /tmp/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101
17/11/16 16:13:58 INFO session.SessionState: Created HDFS directory: /tmp/hive/hdfs/bd69eb21-01c1-4dd3-b31c-16e065ab4101/_tmp_space.db
17/11/16 16:13:58 INFO client.HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /apps/hive/warehouse
17/11/16 16:13:58 INFO sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=bd69eb21-01c1-4dd3-b31c-16e065ab4101, clientType=HIVECLI]
17/11/16 16:13:58 INFO hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
17/11/16 16:13:58 INFO hive.metastore: Trying to connect to metastore with URI thrift://master01.trinitymobility.local:9083
17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.
17/11/16 16:13:58 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
17/11/16 16:13:58 INFO execution.SparkSqlParser: Parsing command: default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb

17/11/16 16:13:58 INFO hive.metastore: Trying to con= nect to metastore with URI thrift://master01.trinitymobility.local:9083=

17/11/16 16:13:58 INFO hive.metastore: Connected to metastore.

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: timestamp

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: double

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: double

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: string

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: boolean

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:13:58 INFO parser.CatalystSqlParser: Par= sing command: int

17/11/16 16:14:00 INFO memory.MemoryStore: Block bro= adcast_0 stored as values in memory (estimated size 373.5 KB, free 365.9 MB)<= u>

17/11/16 16:14:00 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 35.8 KB, free 365.9 MB)

17/11/16 16:14:00 INFO storage.BlockManagerInfo: Add= ed broadcast_0_piece0 in memory on 192.168.1.135:33164 (size: 35.8 KB, free: 366.3 MB)

17/11/16 16:14:00 INFO spark.SparkContext: Created b= roadcast 0 from

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952-a7a5-c0119e3da826.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-6e7511b0b57a.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4eba5a.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92ad38ef16d0.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208.dict

17/11/16 16:14:01 INFO dict.DictionaryManager: DictionaryManager(293019606) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131dbe2d.dict

17/11/16 16:14:01 INFO common.CubeStatsReader: Estimating size for layer 0, all cuboids are 536870911, total size is 0.010198831558227539

17/11/16 16:14:01 INFO spark.SparkCubingByLayer: Partition for spark cubing: 1

17/11/16 16:14:01 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1

17/11/16 16:14:01 INFO spark.SparkContext: Starting job: runJob at SparkHadoopMapReduceWriter.scala:88

17/11/16 16:14:01 INFO mapred.FileInputFormat: Total input paths to process : 1

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Registering RDD 6 (mapToPair at SparkCubingByLayer.java:170)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Got job 0 (runJob at SparkHadoopMapReduceWriter.scala:88) with 1 output partitions

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (runJob at SparkHadoopMapReduceWriter.scala:88)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170), which has no missing parents

17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 25.8 KB, free 365.9 MB)

17/11/16 16:14:01 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 10.7 KB, free 365.9 MB)

17/11/16 16:14:01 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.1.135:33164 (size: 10.7 KB, free: 366.3 MB)

17/11/16 16:14:01 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006

17/11/16 16:14:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[6] at mapToPair at SparkCubingByLayer.java:170) (first 15 tasks are for partitions Vector(0))

17/11/16 16:14:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks

17/11/16 16:14:02 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, ANY, 4978 bytes)

17/11/16 16:14:02 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)

17/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar with timestamp 1510829032977

17/11/16 16:14:02 INFO client.TransportClientFactory: Successfully created connection to /192.168.1.135:42799 after 64 ms (0 ms spent in bootstraps)

17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/metrics-core-2.2.0.jar to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp5518529699147501519.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/metrics-core-2.2.0.jar to class loader

17/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/guava-12.0.1.jar with timestamp 1510829032977

17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/guava-12.0.1.jar to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp2368368452706093062.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/guava-12.0.1.jar to class loader

17/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar with timestamp 1510829032976

17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/htrace-core-3.1.0-incubating.jar to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp4539374910339958167.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/htrace-core-3.1.0-incubating.jar to class loader

17/11/16 16:14:02 INFO executor.Executor: Fetching spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar with timestamp 1510829032978

17/11/16 16:14:02 INFO util.Utils: Fetching spark://192.168.1.135:42799/jars/kylin-job-2.2.0.jar to /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/fetchFileTemp9086394010889635270.tmp

17/11/16 16:14:02 INFO executor.Executor: Adding file:/tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6/userFiles-d0f7729a-5561-48d1-bfd5-e3459b0dc20e/kylin-job-2.2.0.jar to class loader

17/11/16 16:14:02 INFO rdd.HadoopRDD: Input split: hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb/000000_0:0+19534

17/11/16 16:14:02 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]

17/11/16 16:14:02 INFO compress.CodecPool: Got brand-new decompressor [.deflate]

17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 251.01178 ms

17/11/16 16:14:03 INFO codegen.CodeGenerator: Code generated in 55.530064 ms

17/11/16 16:14:03 INFO common.AbstractHadoopJob: Ready to load KylinConfig from uri: kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO cube.CubeManager: Initializing CubeManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO persistence.ResourceStore: Using metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

17/11/16 16:14:03 INFO hdfs.HDFSResourceStore: hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO cube.CubeManager: Loading Cube from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube

17/11/16 16:14:03 INFO cube.CubeDescManager: Initializing CubeDescManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 INFO cube.CubeDescManager: Reloading Cube Metadata from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc

17/11/16 16:14:03 INFO project.ProjectManager: Initializing ProjectManager with metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:03 WARN cachesync.Broadcaster: More than one singleton exist

17/11/16 16:14:03 WARN project.ProjectManager: More than one singleton exist

17/11/16 16:14:03 INFO metadata.MetadataManager: Reloading data model at /model_desc/test_sample_model.json

17/11/16 16:14:03 WARN metadata.MetadataManager: More than one singleton exist, current keys: 1464031233,1545268424

17/11/16 16:14:03 INFO cube.CubeDescManager: Loaded 1 Cube(s)

17/11/16 16:14:03 WARN cube.CubeDescManager: More than one singleton exist

17/11/16 16:14:03 INFO cube.CubeManager: Reloaded cube test_sample_cube being CUBE[name=test_sample_cube] having 1 segments

17/11/16 16:14:03 INFO cube.CubeManager: Loaded 1 cubes, fail on 0 cubes

17/11/16 16:14:03 WARN cube.CubeManager: More than one singleton exist

17/11/16 16:14:03 WARN cube.CubeManager: type: class org.apache.kylin.common.KylinConfig reference: 1464031233

17/11/16 16:14:03 WARN cube.CubeManager: type: class org.apache.kylin.common.KylinConfig reference: 1545268424

17/11/16 16:14:03 WARN dict.DictionaryManager: More than one singleton exist

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID/391cbd21-cc46-48f6-8531-47afa69bea83.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_TYPE/0c92e610-ce7f-4553-9622-aeaf4fe878b6.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_SENSOR_ID/19c299e7-6190-4951-afd7-163137f3988e.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NAME/f8564625-7ec0-4074-9a6f-05977b6e3260.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DISTRESS_NUMBER/739191fd-42e4-4685-8a7a-0e1e4ef7dcd3.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ADDRESS/8a32cc58-19b3-46cb-bcf7-64d1b7b10fe0.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DESC/cab12a4e-2ec9-4d28-81dc-fdac31787942.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CAMERA_LOCATION/44c5ee32-f62c-4d20-a222-954e1c13b537.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_ID_DISPLAY/ab477386-68e1-4e82-8852-ab9bf2a6a114.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LATITUDE/324bf6fe-8b11-4fc1-9943-9db12084dea3.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_LONGITUDE/eb13b237-61a5-4421-b390-7bc1693c3f09.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_DETAILS/d5316933-8229-46c0-bb82-fd0cf01bede5.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_INCIDENT_STATUS/4eff7287-a2f9-403c-9b0b-64a7b03f8f84.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_STATUS_DESCRIPTION/15e49d33-8260-4f5a-ab01-6e5ac7152672.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_THE_GEOM/dfb959be-3710-44d4-a85e-223ee929068d.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_POLICE_STATION_ID/ae15677f-29b9-4952-a7a5-c0119e3da826.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_CATEGORY_ID/ab4a3960-ade3-4537-8198-93bc6786a0e8.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/V_DEVICE_NAME/328f7642-2092-4d4c-83df-6e7511b0b57a.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.V_ANALYST_INCIDENTS/CATEGORY_NAME/2c7d1eea-8a55-412e-b7d6-a2ff093aaf56.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TYPE_CODE/a23bdfee-d2eb-4b2d-8745-8af522641496.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE/6dc5b112-399a-43cd-a8ed-e18a5a4eba5a.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_EVENT_TYPE_HINDI/67e76885-3299-4912-8570-111fe71bd39d.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_STATUS/5743476a-4ca9-4661-9c34-f6dc6e6db62d.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_TIME_TO_COMPLETE/be3cb393-9f8c-49c6-b640-92ad38ef16d0.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_INCIDENT_TIME_TO_COMPLETE/c4024726-85bb-484c-b5ca-4f1c2fb4dec0.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_ID/f593c063-e1d4-4da6-a092-4de55ee3ecbf.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_SOP_ID/f07c0a1e-133a-4a9c-8f05-9a43099c1208.dict

17/11/16 16:14:03 INFO dict.DictionaryManager: DictionaryManager(1001935557) loading DictionaryInfo(loadDictObj:true) at /dict/TRINITYICCC.INDEX_EVENT/IT_PRIORITY_ID/11208e17-c71d-42d0-b72e-696c131dbe2d.dict

17/11/16 16:14:04 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 1347 bytes result sent to driver

17/11/16 16:14:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2084 ms on localhost (executor driver) (1/1)

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool

17/11/16 16:14:04 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPair at SparkCubingByLayer.java:170) finished in 2.118 s

17/11/16 16:14:04 INFO scheduler.DAGScheduler: looking for newly runnable stages

17/11/16 16:14:04 INFO scheduler.DAGScheduler: running: Set()

17/11/16 16:14:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)

17/11/16 16:14:04 INFO scheduler.DAGScheduler: failed: Set()

17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238), which has no missing parents

17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 82.3 KB, free 365.8 MB)

17/11/16 16:14:04 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 31.6 KB, free 365.8 MB)

17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.1.135:33164 (size: 31.6 KB, free: 366.2 MB)

17/11/16 16:14:04 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1006

17/11/16 16:14:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[8] at mapToPair at SparkCubingByLayer.java:238) (first 15 tasks are for partitions Vector(0))

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks

17/11/16 16:14:04 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, ANY, 4621 bytes)

17/11/16 16:14:04 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)

17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks

17/11/16 16:14:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 6 ms

17/11/16 16:14:04 INFO memory.MemoryStore: Block rdd_7_0 stored as bytes in memory (estimated size 49.2 KB, free 365.7 MB)

17/11/16 16:14:04 INFO storage.BlockManagerInfo: Added rdd_7_0 in memory on 192.168.1.135:33164 (size: 49.2 KB, free: 366.2 MB)

17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1

17/11/16 16:14:04 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1

17/11/16 16:14:04 INFO common.AbstractHadoopJob: Ready to load KylinConfig from uri: kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 INFO cube.CubeDescManager: Initializing CubeDescManager with config kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 INFO persistence.ResourceStore: Using metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb for resource store

17/11/16 16:14:04 INFO hdfs.HDFSResourceStore: hdfs meta path : hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 INFO cube.CubeDescManager: Reloading Cube Metadata from folder hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb/cube_desc

17/11/16 16:14:04 INFO project.ProjectManager: Initializing ProjectManager with metadata url kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb

17/11/16 16:14:04 WARN cachesync.Broadcaster: More than one singleton exist

17/11/16 16:14:04 WARN project.ProjectManager: More than one singleton exist

17/11/16 16:14:04 INFO metadata.MetadataManager: Reloading data model at /model_desc/test_sample_model.json

17/11/16 16:14:04 WARN metadata.MetadataManager: More than one singleton exist, current keys: 1464031233,1545268424,1474775600

17/11/16 16:14:04 INFO cube.CubeDescManager: Loaded 1 Cube(s)

17/11/16 16:14:04 WARN cube.CubeDescManager: More than one singleton exist

17/11/16 16:14:04 INFO output.FileOutputCommitter: Saved output of task 'attempt_20171116161401_0001_r_000000_0' to hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/level_base_cuboid/_temporary/0/task_20171116161401_0001_r_000000

17/11/16 16:14:04 INFO mapred.SparkHadoopMapRedUtil: attempt_20171116161401_0001_r_000000_0: Committed

17/11/16 16:14:04 ERROR executor.Executor: Exception in task 0.0 in stage 1.0 (TID 1)

java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

17/11/16 16:14:04 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

17/11/16 16:14:04 ERROR scheduler.TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool

17/11/16 16:14:04 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1

17/11/16 16:14:04 INFO scheduler.DAGScheduler: ResultStage 1 (runJob at SparkHadoopMapReduceWriter.scala:88) failed in 0.393 s due to Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:

17/11/16 16:14:04 INFO scheduler.DAGScheduler: Job 0 failed: runJob at SparkHadoopMapReduceWriter.scala:88, took 3.135125 s

17/11/16 16:14:04 ERROR io.SparkHadoopMapReduceWriter: Aborting job job_20171116161401_0008.

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:

        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)

        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)

        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)

        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

        at scala.Option.foreach(Option.scala:257)

        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)

        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)

        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)

        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)

        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)

        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)

        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)

        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)

        at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:88)

        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)

        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)

        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)

        at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

        at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084)

        at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831)

        at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(SparkCubingByLayer.java:238)

        at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:192)

        at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)

        at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

        at java.lang.reflect.Method.invoke(Method.java:498)

        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)

        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)

        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)

        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)

        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer

        at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)

        at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

        at java.lang.reflect.Method.invoke(Method.java:498)

        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)

        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)

        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)

        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)

        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Caused by: org.apache.spark.SparkException: Job aborted.

        at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:107)

        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1085)

        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1085)

        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)

        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)

        at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)

        at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1084)

        at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831)

        at org.apache.kylin.engine.spark.SparkCubingByLayer.saveToHDFS(SparkCubingByLayer.java:238)

        at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:192)

        at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)

        ... 10 more

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost, executor driver): java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:

        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)

        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)

        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)

        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)

        at scala.Option.foreach(Option.scala:257)

        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)

        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)

        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)

        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)

        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)

        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)

        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)

        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2075)

        at org.apache.spark.internal.io.SparkHadoopMapReduceWriter$.write(SparkHadoopMapReduceWriter.scala:88)

        ... 21 more

Caused by: java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);

        at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:488)

        at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:97)

        at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:517)

        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:622)

        at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)

        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

17/11/16 16:14:04 INFO spark.SparkContext: Invoking stop() from shutdown hook

17/11/16 16:14:04 INFO server.AbstractConnector: Stopped Spark@60bdda65{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}

17/11/16 16:14:04 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.1.135:4040

17/11/16 16:14:04 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!

17/11/16 16:14:04 INFO memory.MemoryStore: MemoryStore cleared

17/11/16 16:14:04 INFO storage.BlockManager: BlockManager stopped

17/11/16 16:14:04 INFO storage.BlockManagerMaster: BlockManagerMaster stopped

17/11/16 16:14:04 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!

17/11/16 16:14:04 INFO spark.SparkContext: Successfully stopped SparkContext

17/11/16 16:14:04 INFO util.ShutdownHookManager: Shutdown hook called

17/11/16 16:14:04 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-1baf8c03-622c-4406-9dd6-13db862ef4b6

The command is:

export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && /usr/local/kylin/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.archive=hdfs://trinitybdhdfs/kylin/spark/spark-libs.jar  --conf spark.yarn.queue=default  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf spark.driver.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf spark.master=local[*]  --conf spark.executor.extraJavaOptions=-Dhdp.version=2.4.3.0-227  --conf spark.hadoop.yarn.timeline-service.enabled=false  --conf spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2 --jars /usr/hdp/2.4.3.0-227/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.3.0-227/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.3.0-227/hbase/lib/guava-12.0.1.jar, /usr/local/kylin/lib/kylin-job-2.2.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_test_sample_cube_d4ccd867_e0ae_4ec2_b2ff_fc5f1cc00dbb -output hdfs://trinitybdhdfs/kylin/kylin_metadata/kylin-26342fa2-68ac-48e4-9eea-814206fb79e3/test_sample_cube/cuboid/ -segmentId d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -metaUrl kylin_metadata@hdfs,path=hdfs://trinitybdhdfs/kylin/kylin_metadata/metadata/d4ccd867-e0ae-4ec2-b2ff-fc5f1cc00dbb -cubename test_sample_cube

        at org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:92)

        at org.apache.kylin.engine.spark.SparkExecutable.doWork(SparkExecutable.java:152)

        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)

        at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:64)

        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)

        at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:144)

        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

        at java.lang.Thread.run(Thread.java:745)

2017-11-16 16:14:05,169 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 : job id:26342fa2-68ac-48e4-9eea-814206fb79e3-06 from RUNNING to ERROR

2017-11-16 16:14:05,217 INFO  [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.ExecutableManager:421 : job id:26342fa2-68ac-48e4-9eea-814206fb79e3 from RUNNING to ERROR

2017-11-16 16:14:05,217 DEBUG [Scheduler 1211098754 Job 26342fa2-68ac-48e4-9eea-814206fb79e3-689] execution.AbstractExecutable:259 : no need to send email, user list is empty

2017-11-16 16:14:05,226 INFO  [pool-8-thread-1] threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 3 already succeed, 1 error, 1 discarded, 0 others

2017-11-16 16:14:16,344 INFO  [pool-8-thread-1] threadpool.DefaultScheduler:123 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 3 already succeed, 1 error, 1 discarded, 0 others
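For reference, the Kryo note in the trace above points at one possible workaround: telling Spark's Kryo serializer about the missing class up front. A minimal sketch, assuming the cube-build job picks up extra Spark properties through the kylin.engine.spark-conf.* prefix in kylin.properties (a hypothetical, unverified configuration, not the confirmed fix for this thread):

    # kylin.properties -- hypothetical workaround sketch, not verified on this cluster
    # Forward a Kryo registration hint to the Spark cubing job so the
    # commit-protocol message class named in the error is known to Kryo.
    kylin.engine.spark-conf.spark.kryo.classesToRegister=org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage

An alternative with a similar effect would be relaxing spark.kryo.registrationRequired for the cubing job, at the cost of losing the registration check; whether either is appropriate depends on the Spark version the job actually runs against.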




--
Best regards,

Shaofeng Shi 史少锋
