kylin-dev mailing list archives

From yuzhang <shifengdefan...@163.com>
Subject Re: Atlas Error when running IT on sandbox 2.4
Date Sun, 05 May 2019 11:50:01 GMT
Well, stopping the Atlas process and removing `org.apache.atlas.hive.hook.HiveHook` from the Hive configuration
in Ambari solves this problem. The Atlas process is not necessary for running the integration tests.
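For reference, the same change can be sketched as a script that edits a copy of hive-site.xml. This is an illustrative sketch, not the official procedure: the property name `hive.exec.post.hooks` is the standard Hive one, but the file path varies per install, and on the sandbox the durable fix is the Ambari change described above.

```python
# Illustrative sketch (assumptions noted above): strip the Atlas hook class
# from hive.exec.post.hooks in a hive-site.xml file. On a real sandbox,
# make the equivalent change in Ambari so it survives service restarts.
import xml.etree.ElementTree as ET

ATLAS_HOOK = "org.apache.atlas.hive.hook.HiveHook"

def strip_atlas_hook(hive_site_path: str) -> None:
    """Remove the Atlas HiveHook entry from hive.exec.post.hooks in place."""
    tree = ET.parse(hive_site_path)
    for prop in tree.getroot().iter("property"):
        if prop.findtext("name") == "hive.exec.post.hooks":
            value = prop.find("value")
            hooks = [h.strip() for h in (value.text or "").split(",")]
            value.text = ",".join(h for h in hooks if h and h != ATLAS_HOOK)
    tree.write(hive_site_path)
```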


yuzhang <shifengdefannao@163.com>
On 5/5/2019 09:01, yuzhang <shifengdefannao@163.com> wrote:


I also found the following exception message in the Atlas log file.


2019-05-05 00:40:06,346 DEBUG - [qtp1798286609-13 - 1a863767-d092-4b8d-a45a-d8cb82d8e6ae:]
~ submitting entity {
  "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
  "id": {
    "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
    "id": "-39494778537956",
    "version": 0,
    "typeName": "hive_table"
  },
  "typeName": "hive_table",
  "values": {
    "tableType": "MANAGED_TABLE",
    "name": "default.kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by@Sandbox",
    "createTime": "1557016193",
    "temporary": false,
    "db": {
      "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
      "id": {
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
        "id": "409729d5-c11f-482e-b211-3f50bd097b8e",
        "version": 0,
        "typeName": "hive_db"
      },
      "typeName": "hive_db",
      "values": {},
      "traitNames": [],
      "traits": {}
    },
    "retention": 0,
    "tableName": "kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by",
    "columns": [
      {
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
        "id": {
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
          "id": "bb04e660-6be2-42a0-8ad2-1b36487e24b0",
          "version": 0,
          "typeName": "hive_column"
        },
        "typeName": "hive_column",
        "values": {},
        "traitNames": [],
        "traits": {}
      }
    ],
    "comment": "",
    "lastAccessTime": 0,
    "owner": "root",
    "sd": {
      "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
      "id": {
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
        "id": "b71fce9e-4e47-4af3-8c51-d8f93a45ebe4",
        "version": 0,
        "typeName": "hive_storagedesc"
      },
      "typeName": "hive_storagedesc",
      "values": {},
      "traitNames": [],
      "traits": {}
    },
    "parameters": {
      "comment": "",
      "transient_lastDdlTime": "1557016193"
    },
    "partitionKeys": [
      {
        "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Reference",
        "id": {
          "jsonClass": "org.apache.atlas.typesystem.json.InstanceSerialization$_Id",
          "id": "81cd90bf-7f7f-4951-9d68-3273337573d3",
          "version": 0,
          "typeName": "hive_column"
        },
        "typeName": "hive_column",
        "values": {},
        "traitNames": [],
        "traits": {}
      }
    ]
  },
  "traitNames": [],
  "traits": {}
}  (EntityResource:94)
2019-05-05 00:40:06,349 ERROR - [qtp1798286609-13 - 1a863767-d092-4b8d-a45a-d8cb82d8e6ae:]
~ Unable to persist entity instance due to a desrialization error  (EntityResource:109)
org.apache.atlas.typesystem.types.ValueConversionException: Cannot convert value 'org.apache.atlas.typesystem.Referenceable@2f651f62'
to datatype hive_table
at org.apache.atlas.typesystem.types.ClassType.convert(ClassType.java:143)
at org.apache.atlas.services.DefaultMetadataService.deserializeClassInstance(DefaultMetadataService.java:252)
at org.apache.atlas.services.DefaultMetadataService.createEntity(DefaultMetadataService.java:230)
at org.apache.atlas.web.resources.EntityResource.submit(EntityResource.java:96)
at sun.reflect.GeneratedMethodAccessor70.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:287)
at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:277)
at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:182)
at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)
at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)
at org.apache.atlas.web.filters.AuditFilter.doFilter(AuditFilter.java:67)
at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)
at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)
at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)
at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)
at com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)
at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.atlas.typesystem.types.ValueConversionException$NullConversionException:
Null value not allowed for multiplicty Multiplicity{lower=1, upper=1, isUnique=false}
at org.apache.atlas.typesystem.types.DataTypes$PrimitiveType.convertNull(DataTypes.java:93)
at org.apache.atlas.typesystem.types.DataTypes$StringType.convert(DataTypes.java:469)
at org.apache.atlas.typesystem.types.DataTypes$StringType.convert(DataTypes.java:452)
at org.apache.atlas.typesystem.types.DataTypes$MapType.convert(DataTypes.java:606)
at org.apache.atlas.typesystem.types.DataTypes$MapType.convert(DataTypes.java:562)
at org.apache.atlas.typesystem.persistence.StructInstance.set(StructInstance.java:118)
at org.apache.atlas.typesystem.types.ClassType.convert(ClassType.java:141)
... 51 more
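A note on the "Caused by" line: Multiplicity{lower=1, upper=1} marks a required, single-valued attribute, and the MapType/StringType frames suggest a string value inside one of the submitted maps arrived as null. The check can be illustrated with a toy sketch; this is NOT Atlas source, and the attribute name used is hypothetical.

```python
# Toy model of the multiplicity check behind the exception above.
# NOT Atlas code; "clusterName" is a made-up attribute name for illustration.
class NullConversionError(Exception):
    pass

def convert_string(value, lower=1, upper=1):
    """Reject null where the type system demands at least one value (lower >= 1)."""
    if value is None and lower >= 1:
        raise NullConversionError(
            f"Null value not allowed for Multiplicity{{lower={lower}, upper={upper}}}")
    return str(value)

values = {}  # mirrors the empty "values":{} maps in the payload above
try:
    convert_string(values.get("clusterName"))
except NullConversionError as e:
    print(e)  # Null value not allowed for Multiplicity{lower=1, upper=1}
```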
On 5/5/2019 08:47, yuzhang <shifengdefannao@163.com> wrote:
Hi all:
I hit this exception when I run the integration tests on sandbox 2.4, or when I run the Hive command directly from the shell. Has anyone met the same problem?
Here is a discussion of this problem, but I don't think it is helpful: https://community.hortonworks.com/questions/224847/i-am-getting-errornull-value-not-allowed-for-multi.html
Best regards
yuzhang




==Hive cmd:==========
hive -e "USE default;


CREATE TABLE IF NOT EXISTS default.ci_inner_join_cube_global_dict
( dict_key STRING COMMENT '',
dict_val INT COMMENT ''
)
COMMENT ''
PARTITIONED BY (dict_column string)
STORED AS TEXTFILE;
DROP TABLE IF EXISTS kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by;
CREATE TABLE IF NOT EXISTS kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by
(
dict_key STRING COMMENT ''
)
COMMENT ''
PARTITIONED BY (dict_column string)
STORED AS SEQUENCEFILE
;
INSERT OVERWRITE TABLE kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by
PARTITION (dict_column = 'TEST_KYLIN_FACT_TEST_COUNT_DISTINCT_BITMAP')
SELECT
TEST_KYLIN_FACT_TEST_COUNT_DISTINCT_BITMAP
FROM kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd
GROUP BY TEST_KYLIN_FACT_TEST_COUNT_DISTINCT_BITMAP
;


" --hiveconf mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec
--hiveconf dfs.replication=2 --hiveconf hive.exec.compress.output=true


================  


=======Error========
org.apache.atlas.AtlasServiceException: Metadata service API CREATE_ENTITY failed with status
400(Bad Request) Response Body ({"error":"Null value not allowed for multiplicty Multiplicity{lower=1,
upper=1, isUnique=false}","stackTrace":"org.apache.atlas.typesystem.types.ValueConversionException$NullConversionException:
Null value not allowed for multiplicty Multiplicity{lower=1, upper=1, isUnique=false}\n\tat
org.apache.atlas.typesystem.types.DataTypes$PrimitiveType.convertNull(DataTypes.java:93)\n\tat
org.apache.atlas.typesystem.types.DataTypes$StringType.convert(DataTypes.java:469)\n\tat org.apache.atlas.typesystem.types.DataTypes$StringType.convert(DataTypes.java:452)\n\tat
org.apache.atlas.typesystem.types.DataTypes$MapType.convert(DataTypes.java:606)\n\tat org.apache.atlas.typesystem.types.DataTypes$MapType.convert(DataTypes.java:562)\n\tat
org.apache.atlas.typesystem.persistence.StructInstance.set(StructInstance.java:118)\n\tat
org.apache.atlas.typesystem.types.ClassType.convert(ClassType.java:141)\n\tat org.apache.atlas.services.DefaultMetadataService.deserializeClassInstance(DefaultMetadataService.java:252)\n\tat
org.apache.atlas.services.DefaultMetadataService.createEntity(DefaultMetadataService.java:230)\n\tat
org.apache.atlas.web.resources.EntityResource.submit(EntityResource.java:96)\n\tat sun.reflect.GeneratedMethodAccessor70.invoke(Unknown
Source)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat
java.lang.reflect.Method.invoke(Method.java:498)\n\tat com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)\n\tat
com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)\n\tat
com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)\n\tat
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)\n\tat
com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)\n\tat
com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)\n\tat
com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)\n\tat
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)\n\tat
com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)\n\tat
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)\n\tat
com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)\n\tat
com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)\n\tat
com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)\n\tat
javax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\tat com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:287)\n\tat
com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:277)\n\tat com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:182)\n\tat
com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)\n\tat
com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)\n\tat
org.apache.atlas.web.filters.AuditFilter.doFilter(AuditFilter.java:67)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)\n\tat
com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)\n\tat
com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)\n\tat com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)\n\tat
com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)\n\tat com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)\n\tat
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)\n\tat
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)\n\tat
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)\n\tat
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)\n\tat
org.eclipse.jetty.server.Server.handle(Server.java:499)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)\n\tat
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)\n\tat org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)\n\tat
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)\n\tat
java.lang.Thread.run(Thread.java:748)\n"})
at org.apache.atlas.AtlasClient.callAPIWithResource(AtlasClient.java:365)
at org.apache.atlas.AtlasClient.callAPI(AtlasClient.java:370)
at org.apache.atlas.AtlasClient.createEntity(AtlasClient.java:210)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.createInstance(HiveMetaStoreBridge.java:132)
at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerTable(HiveMetaStoreBridge.java:323)
at org.apache.atlas.hive.hook.HiveHook.handleCreateTable(HiveHook.java:271)
at org.apache.atlas.hive.hook.HiveHook.fireAndForget(HiveHook.java:205)
at org.apache.atlas.hive.hook.HiveHook.run(HiveHook.java:172)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1585)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1254)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1118)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1108)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:314)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:711)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)


The command is:
hive -e "USE default;


CREATE TABLE IF NOT EXISTS default.ci_inner_join_cube_global_dict
( dict_key STRING COMMENT '',
dict_val INT COMMENT ''
)
COMMENT ''
PARTITIONED BY (dict_column string)
STORED AS TEXTFILE;
DROP TABLE IF EXISTS kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by;
CREATE TABLE IF NOT EXISTS kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by
(
dict_key STRING COMMENT ''
)
COMMENT ''
PARTITIONED BY (dict_column string)
STORED AS SEQUENCEFILE
;
INSERT OVERWRITE TABLE kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd__group_by
PARTITION (dict_column = 'TEST_KYLIN_FACT_TEST_COUNT_DISTINCT_BITMAP')
SELECT
TEST_KYLIN_FACT_TEST_COUNT_DISTINCT_BITMAP
FROM kylin_intermediate_ci_inner_join_cube_325139ef_5dd0_01b4_ae61_bc4dcc99c2bd
GROUP BY TEST_KYLIN_FACT_TEST_COUNT_DISTINCT_BITMAP
;


" --hiveconf mapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec
--hiveconf dfs.replication=2 --hiveconf hive.exec.compress.output=true
at org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:96)
at org.apache.kylin.source.hive.CreateMrHiveDictStep.createMrHiveDict(CreateMrHiveDictStep.java:127)
at org.apache.kylin.source.hive.CreateMrHiveDictStep.doWork(CreateMrHiveDictStep.java:168)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:179)
at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:179)
at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:114)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2019-05-04 17:39:17,968 INFO  [Scheduler 1795467682 Job 54e197e6-9b98-e8f6-ce38-e8c99e3538d6-298]
execution.ExecutableManager:471 : job id:54e197e6-9b98-e8f6-ce38-e8c99e3538d6-01 from RUNNING
to ERROR
2019-05-04 17:39:17,974 DEBUG [pool-4-thread-1] cachesync.Broadcaster:116 : Servers in the
cluster: [localhost:7070]
2019-05-04 17:39:17,975 DEBUG [pool-4-thread-1] cachesync.Broadcaster:126 : Announcing new
broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}
2019-05-04 17:39:17,976 ERROR [pool-5-thread-1] cachesync.Broadcaster:140 : Announce broadcast
event failed, targetNode localhost:7070 broadcastEvent BroadcastEvent{entity=execute_output,
event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}, error msg: org.apache.http.conn.HttpHostConnectException:
Connection to http://localhost:7070 refused
2019-05-04 17:39:17,976 ERROR [pool-5-thread-1] cachesync.Broadcaster:329 : Announce broadcast
event exceeds retry limit, abandon targetNode localhost:7070 broadcastEvent BroadcastEvent{entity=execute_output,
event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}
2019-05-04 17:39:17,980 DEBUG [pool-4-thread-1] cachesync.Broadcaster:116 : Servers in the
cluster: [localhost:7070]
2019-05-04 17:39:17,980 DEBUG [pool-4-thread-1] cachesync.Broadcaster:126 : Announcing new
broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}
2019-05-04 17:39:17,981 ERROR [pool-5-thread-1] cachesync.Broadcaster:140 : Announce broadcast
event failed, targetNode localhost:7070 broadcastEvent BroadcastEvent{entity=execute_output,
event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}, error msg: org.apache.http.conn.HttpHostConnectException:
Connection to http://localhost:7070 refused
2019-05-04 17:39:17,981 ERROR [pool-5-thread-1] cachesync.Broadcaster:329 : Announce broadcast
event exceeds retry limit, abandon targetNode localhost:7070 broadcastEvent BroadcastEvent{entity=execute_output,
event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}
2019-05-04 17:39:17,984 INFO  [Scheduler 1795467682 Job 54e197e6-9b98-e8f6-ce38-e8c99e3538d6-298]
execution.ExecutableManager:471 : job id:54e197e6-9b98-e8f6-ce38-e8c99e3538d6 from RUNNING
to ERROR
2019-05-04 17:39:17,984 DEBUG [Scheduler 1795467682 Job 54e197e6-9b98-e8f6-ce38-e8c99e3538d6-298]
execution.AbstractExecutable:332 : no need to send email, user list is empty
2019-05-04 17:39:17,985 DEBUG [pool-4-thread-1] cachesync.Broadcaster:116 : Servers in the
cluster: [localhost:7070]
2019-05-04 17:39:17,985 DEBUG [pool-4-thread-1] cachesync.Broadcaster:126 : Announcing new
broadcast to all: BroadcastEvent{entity=execute_output, event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}
2019-05-04 17:39:17,986 ERROR [pool-5-thread-1] cachesync.Broadcaster:140 : Announce broadcast
event failed, targetNode localhost:7070 broadcastEvent BroadcastEvent{entity=execute_output,
event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}, error msg: org.apache.http.conn.HttpHostConnectException:
Connection to http://localhost:7070 refused
2019-05-04 17:39:17,986 ERROR [pool-5-thread-1] cachesync.Broadcaster:329 : Announce broadcast
event exceeds retry limit, abandon targetNode localhost:7070 broadcastEvent BroadcastEvent{entity=execute_output,
event=update, cacheKey=54e197e6-9b98-e8f6-ce38-e8c99e3538d6}
2019-05-04 17:39:17,993 INFO  [FetcherRunner 1850834078-290] threadpool.DefaultFetcherRunner:85
: Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 0 already succeed,
1 error, 0 discarded, 0 others
2019-05-04 17:39:21,322 ERROR [main] provision.BuildCubeWithEngine:254 : error
java.lang.RuntimeException: The test 'testInnerJoinCube' is failed.
at org.apache.kylin.provision.BuildCubeWithEngine.runTestAndAssertSucceed(BuildCubeWithEngine.java:250)
at org.apache.kylin.provision.BuildCubeWithEngine.testCase(BuildCubeWithEngine.java:234)
at org.apache.kylin.provision.BuildCubeWithEngine.build(BuildCubeWithEngine.java:211)
at org.apache.kylin.provision.BuildCubeWithEngine.main(BuildCubeWithEngine.java:100)
Time elapsed: 618 sec - in org.apache.kylin.provision.BuildCubeWithEngine
2019-05-04 17:39:21,323 ERROR [main] provision.BuildCubeWithEngine:106 : error
java.lang.RuntimeException: The test 'testInnerJoinCube' is failed.
at org.apache.kylin.provision.BuildCubeWithEngine.runTestAndAssertSucceed(BuildCubeWithEngine.java:250)
at org.apache.kylin.provision.BuildCubeWithEngine.testCase(BuildCubeWithEngine.java:234)
at org.apache.kylin.provision.BuildCubeWithEngine.build(BuildCubeWithEngine.java:211)
at org.apache.kylin.provision.BuildCubeWithEngine.main(BuildCubeWithEngine.java:100)
2019-05-04 17:39:21,327 INFO  [Thread-13] util.ZKUtil:93 : Going to remove 1 cached curator
clients
2019-05-04 17:39:21,337 INFO  [close-hbase-conn] hbase.HBaseConnection:137 : Closing HBase
connections...
2019-05-04 17:39:21,338 INFO  [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:2068
: Closing master protocol: MasterService
2019-05-04 17:39:21,345 INFO  [Thread-13] util.ZKUtil:78 : CuratorFramework for zkString sandbox.hortonworks.com
is removed due to EXPLICIT
2019-05-04 17:39:21,348 INFO  [close-hbase-conn] client.ConnectionManager$HConnectionImplementation:1676
: Closing zookeeper sessionid=0x16a7d4772350019
2019-05-04 17:39:21,349 INFO  [Curator-Framework-0] imps.CuratorFrameworkImpl:821 : backgroundOperationsLoop
exiting
2019-05-04 17:39:21,352 INFO  [close-hbase-conn] zookeeper.ZooKeeper:693 : Session: 0x16a7d4772350019
closed
2019-05-04 17:39:21,352 INFO  [main-EventThread] zookeeper.ClientCnxn:522 : EventThread shut
down for session: 0x16a7d4772350019
2019-05-04 17:39:21,367 INFO  [Thread-13] zookeeper.ZooKeeper:693 : Session: 0x16a7d477235001a
closed
2019-05-04 17:39:21,367 INFO  [main-EventThread] zookeeper.ClientCnxn:522 : EventThread shut
down for session: 0x16a7d477235001a


==================

