kylin-dev mailing list archives

From Billy Liu <billy...@apache.org>
Subject Re: cube run step 3: org.apache.thrift.transport.TTransportException
Date Sun, 05 Nov 2017 02:59:58 GMT
Try copying all the properties from hivemetastore-site.xml into hive-site.xml.
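If it helps, here is a minimal sketch of doing that merge programmatically. It assumes the standard Hadoop-style configuration XML (`<configuration>` with `<property><name>/<value>` children); the file paths are illustrative, so point them at your cluster's actual client configs and work on copies first:

```python
# Sketch: copy <property> entries from hivemetastore-site.xml into
# hive-site.xml without overwriting properties hive-site.xml already has.
# Paths are illustrative assumptions, not fixed locations.
import xml.etree.ElementTree as ET

def merge_hive_configs(metastore_path, hive_site_path, out_path):
    """Append every property from metastore_path that hive_site_path lacks."""
    hive_tree = ET.parse(hive_site_path)
    hive_root = hive_tree.getroot()
    existing = {p.findtext("name") for p in hive_root.findall("property")}
    meta_root = ET.parse(metastore_path).getroot()
    added = []
    for prop in meta_root.findall("property"):
        name = prop.findtext("name")
        if name and name not in existing:
            hive_root.append(prop)  # keeps <name>/<value> children intact
            added.append(name)
    hive_tree.write(out_path, encoding="utf-8", xml_declaration=True)
    return added
```

After merging, make sure Kylin picks up the updated hive-site.xml (restart the Kylin instance so it re-reads the Hive client configuration).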

2017-11-04 22:57 GMT+08:00 ShaoFeng Shi <shaofengshi@apache.org>:

> Hi,
>
> The pre-built Kylin binary package doesn't support FusionInsight, I think.
> At a minimum you need to re-compile Kylin against FI's Hadoop/HBase/Hive
> versions, and also do some tuning with FI-specific configurations. That is
> what I have heard from some users.
>
> 2017-11-04 21:58 GMT+08:00 apache_dev@163.com <apache_dev@163.com>:
>
> > Hi,
> >
> >    We are running Kylin 2.1 on Huawei FusionInsight C60.
> >
> >   When the cube build reaches step #3, "Extract Fact Table Distinct
> > Columns", it fails with the error below. What could be the cause?
> >
> > 2017-11-04 21:25:09,703 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:480 : Trying to connect to metastore with URI thrift://10.10.20.54:21088
> > 2017-11-04 21:25:09,744 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:525 : Opened a connection to metastore, current connections: 1
> > 2017-11-04 21:25:09,801 WARN  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:576 : set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
> > org.apache.thrift.transport.TTransportException
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3794)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3780)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:568)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:249)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:194)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1530)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:87)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:231)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
> >         at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4791)
> >         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584)
> >         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372)
> >         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335)
> >         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250)
> >         at com.google.common.cache.LocalCache.get(LocalCache.java:3985)
> >         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4788)
> >         at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
> >         at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > 2017-11-04 21:25:09,803 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:604 : Connected to metastore.
> > 2017-11-04 21:25:09,976 ERROR [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.log:1220 : Got exception: org.apache.thrift.transport.TTransportException null
> > org.apache.thrift.transport.TTransportException
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_databases(ThriftHiveMetastore.java:730)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_databases(ThriftHiveMetastore.java:717)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1169)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:367)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:153)
> >         at com.sun.proxy.$Proxy58.isOpen(Unknown Source)
> >         at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:205)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > 2017-11-04 21:25:09,977 ERROR [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.log:1221 : Converting exception to MetaException
> > 2017-11-04 21:25:09,991 WARN  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] transport.TIOStreamTransport:112 : Error closing output stream.
> > java.net.SocketException: Socket closed
> >         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:116)
> >         at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
> >         at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
> >         at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
> >         at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
> >         at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
> >         at org.apache.thrift.transport.TSocket.close(TSocket.java:235)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:632)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDown(HiveClientCache.java:403)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDownIfUnused(HiveClientCache.java:393)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.close(HiveClientCache.java:383)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:153)
> >         at com.sun.proxy.$Proxy58.close(Unknown Source)
> >         at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:208)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > 2017-11-04 21:25:09,993 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:633 : Closed a connection to metastore, current connections: 0
> > 2017-11-04 21:25:09,993 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:480 : Trying to connect to metastore with URI thrift://10.10.20.54:21088
> > 2017-11-04 21:25:09,994 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:525 : Opened a connection to metastore, current connections: 1
> > 2017-11-04 21:25:10,055 WARN  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:576 : set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
> > org.apache.thrift.transport.TTransportException
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3794)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3780)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:568)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:249)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:194)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:330)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1530)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:87)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:231)
> >         at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227)
> >         at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4791)
> >         at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3584)
> >         at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2372)
> >         at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2335)
> >         at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2250)
> >         at com.google.common.cache.LocalCache.get(LocalCache.java:3985)
> >         at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4788)
> >         at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
> >         at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:209)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > 2017-11-04 21:25:10,057 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:604 : Connected to metastore.
> > 2017-11-04 21:25:10,081 WARN  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] metastore.RetryingMetaStoreClient:191 : MetaStoreClient lost connection. Attempting to reconnect.
> > org.apache.thrift.transport.TTransportException
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1260)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1246)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1357)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:153)
> >         at com.sun.proxy.$Proxy58.getTable(Unknown Source)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > 2017-11-04 21:25:11,082 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:289 : Invoke failed on metastore: thrift://10.10.20.54:21088. Add it to blacklist.
> > 2017-11-04 21:25:11,083 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:480 : Trying to connect to metastore with URI thrift://10.10.20.53:21088
> > 2017-11-04 21:25:11,084 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:525 : Opened a connection to metastore, current connections: 2
> > 2017-11-04 21:25:11,109 WARN  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:576 : set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
> > org.apache.thrift.transport.TTransportException
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3794)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3780)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:568)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:343)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:149)
> >         at com.sun.proxy.$Proxy58.getTable(Unknown Source)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > 2017-11-04 21:25:11,110 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] hive.metastore:604 : Connected to metastore.
> > 2017-11-04 21:25:11,114 ERROR [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] common.MapReduceExecutable:129 : error execute MapReduceExecutable{id=8557407a-8e87-49aa-9cff-b7e3f58c2f12-02, name=Extract Fact Table Distinct Columns, state=RUNNING}
> > java.lang.RuntimeException: java.io.IOException: org.apache.thrift.transport.TTransportException
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:113)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:141)
> >         at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:119)
> >         at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
> >         at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:122)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:65)
> >         at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:125)
> >         at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:141)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> >         at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.io.IOException: org.apache.thrift.transport.TTransportException
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> >         at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:110)
> >         ... 11 more
> > Caused by: org.apache.thrift.transport.TTransportException
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1260)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1246)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1357)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:153)
> >         at com.sun.proxy.$Proxy58.getTable(Unknown Source)
> >         at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
> >         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
> >         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> >         ... 13 more
> > 2017-11-04 21:25:11,307 INFO  [Job 8557407a-8e87-49aa-9cff-b7e3f58c2f12-102] execution.ExecutableManager:425 : job id:8557407a-8e87-49aa-9cff-b7e3f58c2f12-02 from RUNNING to ERROR
> >
> > apache_dev@163.com
> >
>
>
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
>
