From: Ahmed Mahran
Date: Tue, 12 Jul 2016 17:40:50 +0200
Subject: Timeout visiting cube!
To: user@kylin.apache.org

Hi,

Kylin 1.5.2.1-cdh5.7
Hadoop 2.6.0-cdh5.7.1
Hive 1.1.0-cdh5.7.1
HBase 1.2.0-cdh5.7.1

kylin.log

2016-07-12 08:10:03,894 ERROR [pool-5-thread-1] dao.ExecutableDao:145 : error get all Jobs:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.java:137)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseResourceStore.java:107)
    at org.apache.kylin.common.persistence.ResourceStore.listResources(ResourceStore.java:123)
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
    at org.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.java:204)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:10:03,900 ERROR [pool-5-thread-1] manager.ExecutableManager:206 : error get All Job Ids
org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:146)
    at org.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.java:204)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.java:137)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseResourceStore.java:107)
    at org.apache.kylin.common.persistence.ResourceStore.listResources(ResourceStore.java:123)
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
    ... 9 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:10:03,901 WARN  [pool-5-thread-1] threadpool.DefaultScheduler:108 : Job Fetcher caught a exception java.lang.RuntimeException: org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
2016-07-12 08:10:11,955 INFO  [http-bio-7070-exec-7] controller.QueryController:174 : Using project: CDR_Demo_Project
2016-07-12 08:10:11,958 INFO  [http-bio-7070-exec-7] controller.QueryController:175 : The original query:  select * from CDR_AGGREGATION
2016-07-12 08:10:11,960 INFO  [http-bio-7070-exec-7] service.QueryService:266 : The corrected query: select * from CDR_AGGREGATION LIMIT 50000
2016-07-12 08:10:12,053 INFO  [http-bio-7070-exec-7] routing.QueryRouter:48 : The project manager's reference is org.apache.kylin.metadata.project.ProjectManager@5759a069
2016-07-12 08:10:12,054 INFO  [http-bio-7070-exec-7] routing.QueryRouter:60 : Find candidates by table DEFAULT.CDR_AGGREGATION and project=CDR_DEMO_PROJECT : org.apache.kylin.query.routing.Candidate@44071cfe
2016-07-12 08:10:12,055 INFO  [http-bio-7070-exec-7] routing.QueryRouter:49 : Applying rule: class org.apache.kylin.query.routing.rules.RemoveUncapableRealizationsRule, realizations before: [CDR_Demo_Cube_1(CUBE)], realizations after: [CDR_Demo_Cube_1(CUBE)]
2016-07-12 08:10:12,055 INFO  [http-bio-7070-exec-7] routing.QueryRouter:49 : Applying rule: class org.apache.kylin.query.routing.rules.RealizationSortRule, realizations before: [CDR_Demo_Cube_1(CUBE)], realizations after: [CDR_Demo_Cube_1(CUBE)]
2016-07-12 08:10:12,056 INFO  [http-bio-7070-exec-7] routing.QueryRouter:72 : The realizations remaining: [CDR_Demo_Cube_1(CUBE)] And the final chosen one is the first one
2016-07-12 08:10:12,077 DEBUG [http-bio-7070-exec-7] enumerator.OLAPEnumerator:107 : query storage...
2016-07-12 08:10:12,078 INFO  [http-bio-7070-exec-7] enumerator.OLAPEnumerator:181 : No group by and aggregation found in this query, will hack some result for better look of output...
2016-07-12 08:10:12,078 INFO  [http-bio-7070-exec-7] v2.CubeStorageQuery:239 : exactAggregation is true
2016-07-12 08:10:12,079 INFO  [http-bio-7070-exec-7] v2.CubeStorageQuery:357 : Enable limit 50000
2016-07-12 08:10:12,087 DEBUG [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:257 : New scanner for current segment CDR_Demo_Cube_1[20150101000000_20160101000000] will use SCAN_FILTER_AGGR_CHECKMEM as endpoint's behavior
2016-07-12 08:10:12,088 DEBUG [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:313 : Serialized scanRequestBytes 660 bytes, rawScanBytesString 106 bytes
2016-07-12 08:10:12,088 INFO  [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:315 : The scan 416215cb for segment CDR_Demo_Cube_1[20150101000000_20160101000000] is as below with 1 separate raw scans, shard part of start/end key is set to 0
2016-07-12 08:10:12,090 INFO  [http-bio-7070-exec-7] v2.CubeHBaseRPC:271 : Visiting hbase table KYLIN_HX9PP90NMQ: cuboid exact match, from 15 to 15 Start: \x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00 (\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00) Stop: \x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\x00 (\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\x00), No Fuzzy Key
2016-07-12 08:10:12,090 DEBUG [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:320 : Submitting rpc to 2 shards starting from shard 0, scan range count 1
2016-07-12 08:10:12,163 INFO  [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:103 : Timeout for ExpectedSizeIterator is: 66000
2016-07-12 08:10:12,164 DEBUG [http-bio-7070-exec-7] enumerator.OLAPEnumerator:127 : return TupleIterator...
2016-07-12 08:10:45,430 ERROR [pool-11-thread-5] util.LoggableCachedThreadPool:44 : Execution exception when running task in pool-11-thread-5
2016-07-12 08:10:45,432 ERROR [pool-11-thread-5] util.LoggableCachedThreadPool:54 : Caught exception in thread pool-11-thread-5:
java.lang.RuntimeException: <sub-thread for GTScanRequest 416215cb> Error when visiting cubes by endpoint
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$1.run(CubeHBaseEndpointRPC.java:345)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1179)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1136)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:971)
    at org.apache.hadoop.hbase.client.HRegionLocator.getRegionLocation(HRegionLocator.java:83)
    at org.apache.hadoop.hbase.client.HTable.getRegionLocation(HTable.java:569)
    at org.apache.hadoop.hbase.client.HTable.getKeysAndRegionsInRange(HTable.java:793)
    at org.apache.hadoop.hbase.client.HTable.getKeysAndRegionsInRange(HTable.java:763)
    at org.apache.hadoop.hbase.client.HTable.getStartKeysInRange(HTable.java:1830)
    at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1785)
    at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1765)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC.getResults(CubeHBaseEndpointRPC.java:389)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC.access$200(CubeHBaseEndpointRPC.java:75)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$1.run(CubeHBaseEndpointRPC.java:343)
    ... 5 more
2016-07-12 08:11:03,839 ERROR [pool-5-thread-1] dao.ExecutableDao:145 : error get all Jobs:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.java:137)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseResourceStore.java:107)
    at org.apache.kylin.common.persistence.ResourceStore.listResources(ResourceStore.java:123)
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
    at org.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.java:204)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:11:03,842 ERROR [pool-5-thread-1] manager.ExecutableManager:206 : error get All Job Ids
org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:146)
    at org.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.java:204)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.java:137)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseResourceStore.java:107)
    at org.apache.kylin.common.persistence.ResourceStore.listResources(ResourceStore.java:123)
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
    ... 9 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:11:03,843 WARN  [pool-5-thread-1] threadpool.DefaultScheduler:108 : Job Fetcher caught a exception java.lang.RuntimeException: org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
2016-07-12 08:11:18,165 ERROR [http-bio-7070-exec-7] controller.QueryController:209 : Exception when execute sql
java.sql.SQLException: Error while executing SQL "select * from CDR_AGGREGATION LIMIT 50000": Timeout visiting cube!
    at org.apache.calcite.avatica.Helper.createException(Helper.java:56)
    at org.apache.calcite.avatica.Helper.createException(Helper.java:41)
    at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:143)
    at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:186)
    at org.apache.kylin.rest.service.QueryService.execute(QueryService.java:361)
    at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:273)
    at org.apache.kylin.rest.service.QueryService.query(QueryService.java:121)
    at org.apache.kylin.rest.service.QueryService$$FastClassByCGLIB$$4957273f.invoke()
    at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
    at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:618)
    at org.apache.kylin.rest.service.QueryService$$EnhancerByCGLIB$$2b43fc30.query()
    at org.apache.kylin.rest.controller.QueryController.doQueryWithCache(QueryController.java:192)
    at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:94)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:201)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.ui.DefaultLoginPageGeneratingFilter.doFilter(DefaultLoginPageGeneratingFilter.java:91)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:195)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:266)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:504)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:421)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1074)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:314)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Timeout visiting cube!
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$ExpectedSizeIterator.next(CubeHBaseEndpointRPC.java:127)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$ExpectedSizeIterator.next(CubeHBaseEndpointRPC.java:81)
    at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48)
    at com.google.common.collect.Iterators$6.hasNext(Iterators.java:583)
    at org.apache.kylin.storage.hbase.cube.v2.SequentialCubeTupleIterator.hasNext(SequentialCubeTupleIterator.java:96)
    at org.apache.kylin.query.enumerator.OLAPEnumerator.moveNext(OLAPEnumerator.java:74)
    at org.apache.calcite.linq4j.EnumerableDefaults$TakeWhileEnumerator.moveNext(EnumerableDefaults.java:2818)
    at org.apache.calcite.linq4j.Linq4j$EnumeratorIterator.<init>(Linq4j.java:664)
    at org.apache.calcite.linq4j.Linq4j.enumeratorIterator(Linq4j.java:98)
    at org.apache.calcite.linq4j.AbstractEnumerable.iterator(AbstractEnumerable.java:33)
    at org.apache.calcite.avatica.MetaImpl.createCursor(MetaImpl.java:85)
    at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:190)
    at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:65)
    at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
    at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:566)
    at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:578)
    at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:571)
    at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:135)
    ... 80 more
2016-07-12 08:11:18,173 INFO  [http-bio-7070-exec-7] service.QueryService:250 :
==========================[QUERY]===============================
SQL: select * from CDR_AGGREGATION
User: ADMIN
Success: false
Duration: 0.0
Project: CDR_Demo_Project
Realization Names: [CDR_Demo_Cube_1]
Cuboid Ids: [15]
Total scan count: 0
Result row count: 0
Accept Partial: true
Is Partial Result: false
Hit Exception Cache: false
Storage cache used: false
Message: Error while executing SQL "select * from CDR_AGGREGATION LIMIT 50000": Timeout visiting cube!
==========================[QUERY]===============================
2016-07-12 08:11:18,174 ERROR [http-bio-7070-exec-7] controller.BasicController:44 :
org.apache.kylin.rest.exception.InternalErrorException: Error while executing SQL "select * from CDR_AGGREGATION LIMIT 50000": Timeout visiting cube!
    at org.apache.kylin.rest.controller.QueryController.doQueryWithCache(QueryController.java:224)
    at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:94)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:201)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.ui.DefaultLoginPageGeneratingFilter.doFilter(DefaultLoginPageGeneratingFilter.java:91)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:195)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:266)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:504)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:421)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1074)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:314)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)
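In case it helps narrow this down: the repeated "java.net.ConnectException: Connection refused" while the client is locating regions suggests the HBase RegionServer (or the whole HBase service) is unreachable from the Kylin node, and the "Timeout visiting cube!" looks like a downstream symptom of that. Below is the kind of minimal connectivity check I can run from the same machine; it is only a sketch, assuming the stock HBase 1.x client API and the same hbase-site.xml that Kylin reads, with KYLIN_HX9PP90NMQ being the cube HTable from the log above.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class HBaseReachabilityCheck {
    public static void main(String[] args) throws Exception {
        // Assumption: the hbase-site.xml Kylin uses is on the classpath,
        // so this resolves the same ZooKeeper quorum and RegionServers.
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {
            // If a RegionServer is down, this fails with the same
            // "Connection refused" seen in kylin.log.
            System.out.println("Cluster status: " + admin.getClusterStatus());
            // Cube storage table name taken from the log line above.
            TableName cubeTable = TableName.valueOf("KYLIN_HX9PP90NMQ");
            System.out.println(cubeTable + " available: " + admin.isTableAvailable(cubeTable));
        }
    }
}

If this check fails with the same Connection refused, the problem is HBase availability from the Kylin host rather than the query or the cube itself.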
Hi,

Kylin 1.5.2.1-cdh5.7
Hadoop 2.6= .0-cdh5.7.1
Hive 1.1.0-cdh5.7.1
HBase 1.2.= 0-cdh5.7.1

kylin.log

2016-= 07-12 08:10:03,894 ERROR [pool-5-thread-1] dao.ExecutableDao:145 : error ge= t all Jobs:
org.apache.hadoop.hbase.clien= t.RetriesExhaustedException: Can't get the location
at org.ap= ache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocatio= ns(RpcRetryingCallerWithReadReplicas.java:316)
at org.apache.hado= op.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplica= s.java:156)
at org.apache.hadoop.hbase.client.ScannerCallableWith= Replicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.= hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.= java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(Cl= ientScanner.java:320)
at org.apache.hadoop.hbase.client.ClientSca= nner.nextScanner(ClientScanner.java:295)
= at org.apache.hadoop.hba= se.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:= 160)
at org.apache.hadoop.hbase.client.ClientScanner.<init>= (ClientScanner.java:155)
at org.apache.hadoop.hbase.client.HTable= .getScanner(HTable.java:867)
at org.apache.kylin.storage.hbase.H= BaseResourceStore.visitFolder(HBaseResourceStore.java:137)
at or= g.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseReso= urceStore.java:107)
at org.apache.kylin.common.persistence.Resour= ceStore.listResources(ResourceStore.java:123)
at org.apache.kylin= .job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
at org.a= pache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.ja= va:204)
at org.apache.kylin.job.impl.threadpool.DefaultScheduler$= FetcherRunner.run(DefaultScheduler.java:81)
at java.util.concurre= nt.Executors$RunnableAdapter.call(Executors.java:511)
at java.uti= l.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.= util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(= ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.Sch= eduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecuto= r.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(= ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolE= xecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thre= ad.run(Thread.java:745)
Caused by: java.n= et.ConnectException: Connection refused
<= span class=3D"" style=3D"white-space:pre"> at sun.nio.ch.SocketChann= elImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImp= l.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.= net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at = org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at or= g.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
at org.ap= ache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImp= l.java:416)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connecti= on.setupIOstreams(RpcClientImpl.java:722)
at org.apache.hadoop.hb= ase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
=
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteReq= uest(RpcClientImpl.java:873)
at org.apache.hadoop.hbase.ipc.RpcC= lientImpl.call(RpcClientImpl.java:1242)
<= span class=3D"" style=3D"white-space:pre"> at org.apache.hadoop.hbas= e.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
<= /span>at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelIm= plementation.callBlockingMethod(AbstractRpcClient.java:331)
at or= g.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$Blockin= gStub.get(ClientProtos.java:34070)
at org.apache.hadoop.hbase.pro= tobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
at or= g.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.lo= cateRegionInMeta(ConnectionManager.java:1398)
at org.apache.hadoo= p.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(Con= nectionManager.java:1199)
at org.apache.hadoop.hbase.client.RpcRe= tryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadRe= plicas.java:305)
... 21 more
= 2016-07-12 08:10:03,900 ERROR [pool-5-thread-1] manager.ExecutableManager:2= 06 : error get All Job Ids
org.apache.kyl= in.job.exception.PersistentException: org.apache.hadoop.hbase.client.Retrie= sExhaustedException: Can't get the location
at org.apache.kyl= in.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:146)
at or= g.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager= .java:204)
at org.apache.kylin.job.impl.threadpool.DefaultSchedul= er$FetcherRunner.run(DefaultScheduler.java:81)
at java.util.concu= rrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.= util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at ja= va.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$3= 01(ScheduledThreadPoolExecutor.java:180)
= at java.util.concurrent.= ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExec= utor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWork= er(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPo= olExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.T= hread.run(Thread.java:745)
Caused by: org= .apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the lo= cation
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithRea= dReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
= at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(= ScannerCallableWithReplicas.java:156)
at org.apache.hadoop.hbase.= client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60= )
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithout= Retries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.cl= ient.ClientScanner.call(ClientScanner.java:320)
at org.apache.had= oop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstr= uction(ClientScanner.java:160)
at org.apache.hadoop.hbase.client.= ClientScanner.<init>(ClientScanner.java:155)
at org.apache.= hadoop.hbase.client.HTable.getScanner(HTable.java:867)
at org.apa= che.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.j= ava:137)
at org.apache.kylin.storage.hbase.HBaseResourceStore.lis= tResourcesImpl(HBaseResourceStore.java:107)
at org.apache.kylin.c= ommon.persistence.ResourceStore.listResources(ResourceStore.java:123)
=
at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java= :135)
... 9 more
Caused by: jav= a.net.ConnectException: Connection refused
at sun.nio.ch.SocketCh= annelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannel= Impl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hado= op.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
= at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at o= rg.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
at org.a= pache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientIm= pl.java:416)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connect= ion.setupIOstreams(RpcClientImpl.java:722)
at org.apache.hadoop.h= base.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
<= /span>at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRe= quest(RpcClientImpl.java:873)
at org.apache.hadoop.hbase.ipc.RpcC= lientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:10:03,901 WARN  [pool-5-thread-1] threadpool.DefaultScheduler:108 : Job Fetcher caught a exception java.lang.RuntimeException: org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
2016-07-12 08:10:11,955 INFO  [http-bio-7070-exec-7] controller.QueryController:174 : Using project: CDR_Demo_Project
2016-07-12 08:10:11,958 INFO  [http-bio-7070-exec-7] controller.QueryController:175 : The original query:  select * from CDR_AGGREGATION
2016-07-12 08:10:11,960 INFO  [http-bio-7070-exec-7] service.QueryService:266 : The corrected query: select * from CDR_AGGREGATION
LIMIT 50000
2016-07-12 08:10:12,053 INFO  [http-bio-7070-exec-7] routing.QueryRouter:48 : The project manager's reference is org.apache.kylin.metadata.project.ProjectManager@5759a069
2016-07-12 08:10:12,054 INFO  [http-bio-7070-exec-7] routing.QueryRouter:60 : Find candidates by table DEFAULT.CDR_AGGREGATION and project=CDR_DEMO_PROJECT : org.apache.kylin.query.routing.Candidate@44071cfe
2016-07-12 08:10:12,055 INFO  [http-bio-7070-exec-7] routing.QueryRouter:49 : Applying rule: class org.apache.kylin.query.routing.rules.RemoveUncapableRealizationsRule, realizations before: [CDR_Demo_Cube_1(CUBE)], realizations after: [CDR_Demo_Cube_1(CUBE)]
2016-07-12 08:10:12,055 INFO  [http-bio-7070-exec-7] routing.QueryRouter:49 : Applying rule: class org.apache.kylin.query.routing.rules.RealizationSortRule, realizations before: [CDR_Demo_Cube_1(CUBE)], realizations after: [CDR_Demo_Cube_1(CUBE)]
2016-07-12 08:10:12,056 INFO  [http-bio-7070-exec-7] routing.QueryRouter:72 : The realizations remaining: [CDR_Demo_Cube_1(CUBE)] And the final chosen one is the first one
2016-07-12 08:10:12,077 DEBUG [http-bio-7070-exec-7] enumerator.OLAPEnumerator:107 : query storage...
2016-07-12 08:10:12,078 INFO  [http-bio-7070-exec-7] enumerator.OLAPEnumerator:181 : No group by and aggregation found in this query, will hack some result for better look of output...
2016-07-12 08:10:12,078 INFO  [http-bio-7070-exec-7] v2.CubeStorageQuery:239 : exactAggregation is true
2016-07-12 08:10:12,079 INFO  [http-bio-7070-exec-7] v2.CubeStorageQuery:357 : Enable limit 50000
2016-07-12 08:10:12,087 DEBUG [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:257 : New scanner for current segment CDR_Demo_Cube_1[20150101000000_20160101000000] will use SCAN_FILTER_AGGR_CHECKMEM as endpoint's behavior
2016-07-12 08:10:12,088 DEBUG [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:313 : Serialized scanRequestBytes 660 bytes, rawScanBytesString 106 bytes
2016-07-12 08:10:12,088 INFO  [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:315 : The scan 416215cb for segment CDR_Demo_Cube_1[20150101000000_20160101000000] is as below with 1 separate raw scans, shard part of start/end key is set to 0
2016-07-12 08:10:12,090 INFO  [http-bio-7070-exec-7] v2.CubeHBaseRPC:271 : Visiting hbase table KYLIN_HX9PP90NMQ: cuboid exact match, from 15 to 15 Start: \x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00 (\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00) Stop:  \x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\x00 (\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0F\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF\x00), No Fuzzy Key
2016-07-12 08:10:12,090 DEBUG [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:320 : Submitting rpc to 2 shards starting from shard 0, scan range count 1
2016-07-12 08:10:12,163 INFO  [http-bio-7070-exec-7] v2.CubeHBaseEndpointRPC:103 : Timeout for ExpectedSizeIterator is: 66000
2016-07-12 08:10:12,164 DEBUG [http-bio-7070-exec-7] enumerator.OLAPEnumerator:127 : return TupleIterator...
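(A note on the two lines above: the 66000 ms seems to be the window the query thread waits for the HBase endpoint coprocessors to answer, and the "Timeout visiting cube!" further down is raised from ExpectedSizeIterator.next when nothing arrives inside that window. Below is a rough sketch of that wait pattern, just to make the failure mode concrete; the class and member names are illustrative only, this is not Kylin's actual code.)

// Illustrative sketch of a bounded wait on coprocessor results.
// The RPC callback thread offers each shard's result into a queue;
// the query thread polls with a deadline and gives up after timeoutMillis
// (66,000 ms in the log above), which is the condition the
// "Timeout visiting cube!" error reports.
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class BoundedResultIterator {
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(1024);
    private final long timeoutMillis;

    public BoundedResultIterator(long timeoutMillis) {
        this.timeoutMillis = timeoutMillis;
    }

    /** Called by the RPC callback thread when a shard returns its rows. */
    public void offerResult(byte[] shardResult) {
        queue.offer(shardResult);
    }

    /** Called by the query thread; fails when no shard answers in time. */
    public byte[] next() {
        try {
            byte[] result = queue.poll(timeoutMillis, TimeUnit.MILLISECONDS);
            if (result == null) {
                // No coprocessor response within the window.
                throw new RuntimeException("Timeout visiting cube!");
            }
            return result;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RuntimeException(e);
        }
    }
}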
2016-07-12 08:10:45,430 ERROR [pool-11-thread-5] util.LoggableCachedThreadPool:44 : Execution exception when running task in pool-11-thread-5
2016-07-12 08:10:45,432 ERROR [pool-11-thread-5] util.LoggableCachedThreadPool:54 : Caught exception in thread pool-11-thread-5:
java.lang.RuntimeException: <sub-thread for GTScanRequest 416215cb> Error when visiting cubes by endpoint
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$1.run(CubeHBaseEndpointRPC.java:345)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1179)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1136)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:971)
    at org.apache.hadoop.hbase.client.HRegionLocator.getRegionLocation(HRegionLocator.java:83)
    at org.apache.hadoop.hbase.client.HTable.getRegionLocation(HTable.java:569)
    at org.apache.hadoop.hbase.client.HTable.getKeysAndRegionsInRange(HTable.java:793)
    at org.apache.hadoop.hbase.client.HTable.getKeysAndRegionsInRange(HTable.java:763)
    at org.apache.hadoop.hbase.client.HTable.getStartKeysInRange(HTable.java:1830)
    at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1785)
    at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1765)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC.getResults(CubeHBaseEndpointRPC.java:389)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC.access$200(CubeHBaseEndpointRPC.java:75)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$1.run(CubeHBaseEndpointRPC.java:343)
    ... 5 more
2016-07-12 08:11:03,839 ERROR [pool-5-thread-1] dao.ExecutableDao:145 : error get all Jobs:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.java:137)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseResourceStore.java:107)
    at org.apache.kylin.common.persistence.ResourceStore.listResources(ResourceStore.java:123)
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
    at org.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.java:204)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:11:03,842 ERROR [pool-5-thread-1] manager.ExecutableManager:206 : error get All Job Ids
org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:146)
    at org.apache.kylin.job.manager.ExecutableManager.getAllJobIds(ExecutableManager.java:204)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$FetcherRunner.run(DefaultScheduler.java:81)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:316)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.visitFolder(HBaseResourceStore.java:137)
    at org.apache.kylin.storage.hbase.HBaseResourceStore.listResourcesImpl(HBaseResourceStore.java:107)
    at org.apache.kylin.common.persistence.ResourceStore.listResources(ResourceStore.java:123)
    at org.apache.kylin.job.dao.ExecutableDao.getJobIds(ExecutableDao.java:135)
    ... 9 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:416)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:722)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1582)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    ... 21 more
2016-07-12 08:11:03,843 WARN  [pool-5-thread-1] threadpool.DefaultScheduler:108 : Job Fetcher caught a exception java.lang.RuntimeException: org.apache.kylin.job.exception.PersistentException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
2016-07-12 08:11:18,165 ERROR [http-bio-7070-exec-7] controller.QueryController:209 : Exception when execute sql
java.sql.SQLException: Error while executing SQL "select * from CDR_AGGREGATION
LIMIT 50000": Timeout visiting cube!
    at org.apache.calcite.avatica.Helper.createException(Helper.java:56)
    at org.apache.calcite.avatica.Helper.createException(Helper.java:41)
    at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:143)
    at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:186)
    at org.apache.kylin.rest.service.QueryService.execute(QueryService.java:361)
    at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:273)
    at org.apache.kylin.rest.service.QueryService.query(QueryService.java:121)
    at org.apache.kylin.rest.service.QueryService$$FastClassByCGLIB$$4957273f.invoke(<generated>)
    at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
    at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:618)
    at org.apache.kylin.rest.service.QueryService$$EnhancerByCGLIB$$2b43fc30.query(<generated>)
    at org.apache.kylin.rest.controller.QueryController.doQueryWithCache(QueryController.java:192)
    at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:94)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:201)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.ui.DefaultLoginPageGeneratingFilter.doFilter(DefaultLoginPageGeneratingFilter.java:91)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:195)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:266)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:504)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:421)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1074)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:314)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Timeout visiting cube!
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$ExpectedSizeIterator.next(CubeHBaseEndpointRPC.java:127)
    at org.apache.kylin.storage.hbase.cube.v2.CubeHBaseEndpointRPC$ExpectedSizeIterator.next(CubeHBaseEndpointRPC.java:81)
    at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:48)
    at com.google.common.collect.Iterators$6.hasNext(Iterators.java:583)
    at org.apache.kylin.storage.hbase.cube.v2.SequentialCubeTupleIterator.hasNext(SequentialCubeTupleIterator.java:96)
    at org.apache.kylin.query.enumerator.OLAPEnumerator.moveNext(OLAPEnumerator.java:74)
    at org.apache.calcite.linq4j.EnumerableDefaults$TakeWhileEnumerator.moveNext(EnumerableDefaults.java:2818)
    at org.apache.calcite.linq4j.Linq4j$EnumeratorIterator.<init>(Linq4j.java:664)
    at org.apache.calcite.linq4j.Linq4j.enumeratorIterator(Linq4j.java:98)
    at org.apache.calcite.linq4j.AbstractEnumerable.iterator(AbstractEnumerable.java:33)
    at org.apache.calcite.avatica.MetaImpl.createCursor(MetaImpl.java:85)
    at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:190)
    at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:65)
    at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
    at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:566)
    at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:578)
    at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:571)
    at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:135)
    ... 80 more
2016-07-12 08:11:18,173 INFO  [http-bio-7070-exec-7] service.QueryService:250 :
==========================[QUERY]===============================
SQL: select * from CDR_AGGREGATION
User: ADMIN
Success: false
Duration: 0.0
Project: CDR_Demo_Project
Realization Names: [CDR_Demo_Cube_1]
Cuboid Ids: [15]
Total scan count: 0
Result row count: 0
Accept Partial: true
Is Partial Result: false
Hit Exception Cache: false
Storage cache used: false
Message: Error while executing SQL "select * from CDR_AGGREGATION LIMIT 50000": Timeout visiting cube!
==========================[QUERY]===============================

2016-07-12 08:11:18,174 ERROR [http-bio-7070-exec-7] controller.BasicController:44 :
org.apache.kylin.rest.exception.InternalErrorException: Error while executing SQL "select * from CDR_AGGREGATION LIMIT 50000": Timeout visiting cube!
    at org.apache.kylin.rest.controller.QueryController.doQueryWithCache(QueryController.java:224)
    at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:94)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:646)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:201)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.ui.DefaultLoginPageGeneratingFilter.doFilter(DefaultLoginPageGeneratingFilter.java:91)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:183)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
    at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
    at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:195)
    at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:266)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:504)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:421)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1074)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:314)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)
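
Since every trace above bottoms out in java.net.ConnectException: Connection refused while locating HBase regions, I am first checking whether the Kylin server can reach HBase at all, independent of Kylin. Below is a minimal probe I put together for that check; it is only a sketch, the ZooKeeper quorum value is a placeholder for our cluster, and the table name is the one taken from the scan log above.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Minimal connectivity probe against the cube's HBase table (KYLIN_HX9PP90NMQ
// from the scan log). If this also fails with java.net.ConnectException, the
// problem is HBase reachability (region server / ZooKeeper), not the Kylin
// query timeout itself.
public class HBaseProbe {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Placeholder quorum; in practice this comes from hbase-site.xml on the Kylin node.
        conf.set("hbase.zookeeper.quorum", "zk-host.example.com");
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("KYLIN_HX9PP90NMQ"));
             ResultScanner scanner = table.getScanner(new Scan().setCaching(1))) {
            Result first = scanner.next();
            System.out.println("Scan OK, first row key: "
                    + (first == null ? "<empty table>" : Bytes.toStringBinary(first.getRow())));
        }
    }
}

If the probe fails the same way, the "Timeout visiting cube!" is most likely just a symptom of HBase being unreachable from the Kylin node; if it succeeds, the problem is more likely on the coprocessor/timeout side.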

