From: Thanh Mai
Date: Fri, 28 Nov 2014 13:10:07 +0900
Subject: Re: Tajo with TPC-H benchmark
To: dev@tajo.apache.org

Hi Hyunsik,

I have already modified the TPC-H queries so that they can run with Tajo. When the queries are issued sequentially, there is no problem. However, when they are issued concurrently, some queries fail occasionally (not always).

I cannot share the log file, but I can share the error messages below. It seems that the problem is not directly related to Tajo, but to Thrift or ProtoBuf.

I have also increased "hive.metastore.client.socket.timeout" from 20 to 200 and "hive.server.read.socket.timeout" from 10 to 100 in hive-site.xml, but the problem remains.

Thank you for your help!
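For reference, the hive-site.xml entries I changed look roughly like this (a sketch of the snippet; the values are the ones stated above):

  <property>
    <name>hive.metastore.client.socket.timeout</name>
    <value>200</value>
    <!-- raised from 20, as described above -->
  </property>
  <property>
    <name>hive.server.read.socket.timeout</name>
    <value>100</value>
    <!-- raised from 10, as described above -->
  </property>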
Sincerely,
Mai Hai Thanh

----------------------------------------------------------------------------------------------------------------------------------------------

2014-11-28 11:09:38,535 ERROR hive.log: Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out
org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
    at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_all_databases(ThriftHiveMetastore.java:625)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:613)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:837)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy12.getAllDatabases(Unknown Source)
    at org.apache.tajo.catalog.store.HCatalogStore.existDatabase(HCatalogStore.java:369)
    at org.apache.tajo.catalog.CatalogServer$CatalogProtocolHandler.existDatabase(CatalogServer.java:414)
    at org.apache.tajo.catalog.AbstractCatalogClient$9.call(AbstractCatalogClient.java:215)
    at org.apache.tajo.catalog.AbstractCatalogClient$9.call(AbstractCatalogClient.java:212)
    at org.apache.tajo.rpc.ServerCallable.withRetries(ServerCallable.java:95)
    at org.apache.tajo.catalog.AbstractCatalogClient.existDatabase(AbstractCatalogClient.java:212)
    at org.apache.tajo.master.TajoMasterClientService$TajoMasterClientProtocolServiceHandler.existDatabase(TajoMasterClientService.java:624)
    at org.apache.tajo.ipc.TajoMasterClientProtocol$TajoMasterClientProtocolService$2.callBlockingMethod(TajoMasterClientProtocol.java:549)
    at org.apache.tajo.rpc.BlockingRpcServer$ServerHandler.messageReceived(BlockingRpcServer.java:103)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:109)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:90)
    at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketTimeoutException: Read timed out
    at java.net.SocketInputStream.socketRead0(Native Method)
    at java.net.SocketInputStream.read(SocketInputStream.java:152)
    at java.net.SocketInputStream.read(SocketInputStream.java:122)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
    ... 47 more
2014-11-28 11:09:38,538 ERROR hive.log: Converting exception to MetaException
2014-11-28 11:09:38,550 ERROR org.apache.tajo.catalog.CatalogServer: org.apache.tajo.catalog.exception.CatalogException: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
2014-11-28 11:09:38,550 ERROR org.apache.tajo.catalog.AbstractCatalogClient: com.google.protobuf.ServiceException: org.apache.tajo.catalog.exception.CatalogException: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
com.google.protobuf.ServiceException: com.google.protobuf.ServiceException: org.apache.tajo.catalog.exception.CatalogException: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
    at org.apache.tajo.rpc.ServerCallable.withRetries(ServerCallable.java:105)
    at org.apache.tajo.catalog.AbstractCatalogClient.existDatabase(AbstractCatalogClient.java:212)
    at org.apache.tajo.master.TajoMasterClientService$TajoMasterClientProtocolServiceHandler.existDatabase(TajoMasterClientService.java:624)
    at org.apache.tajo.ipc.TajoMasterClientProtocol$TajoMasterClientProtocolService$2.callBlockingMethod(TajoMasterClientProtocol.java:549)
    at org.apache.tajo.rpc.BlockingRpcServer$ServerHandler.messageReceived(BlockingRpcServer.java:103)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
    at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
    at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:109)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:90)
    at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
    at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
    at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.protobuf.ServiceException: org.apache.tajo.catalog.exception.CatalogException: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
    at org.apache.tajo.catalog.CatalogServer$CatalogProtocolHandler.existDatabase(CatalogServer.java:421)
    at org.apache.tajo.catalog.AbstractCatalogClient$9.call(AbstractCatalogClient.java:215)
    at org.apache.tajo.catalog.AbstractCatalogClient$9.call(AbstractCatalogClient.java:212)
    at org.apache.tajo.rpc.ServerCallable.withRetries(ServerCallable.java:95)
    ... 30 more
Caused by: org.apache.tajo.catalog.exception.CatalogException: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
    at org.apache.tajo.catalog.store.HCatalogStore.existDatabase(HCatalogStore.java:372)
    at org.apache.tajo.catalog.CatalogServer$CatalogProtocolHandler.existDatabase(CatalogServer.java:414)
    ... 33 more
Caused by: MetaException(message:Got exception: org.apache.thrift.transport.TTransportException java.net.SocketTimeoutException: Read timed out)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:1102)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:839)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy12.getAllDatabases(Unknown Source)
    at org.apache.tajo.catalog.store.HCatalogStore.existDatabase(HCatalogStore.java:369)
    ... 34 more

ERROR: Exception was thrown. Caused by org.apache.tajo.cli.InvalidClientSessionException: ERROR: database "default" does not exist
Exception in thread "main" com.google.protobuf.ServiceException: org.apache.tajo.cli.InvalidClientSessionException: ERROR: database "default" does not exist
    at org.apache.tajo.rpc.ServerCallable.withRetries(ServerCallable.java:105)
    at org.apache.tajo.client.TajoClient.getCurrentDatabase(TajoClient.java:262)
    at org.apache.tajo.cli.TajoCli.(TajoCli.java:243)
    at org.apache.tajo.cli.TajoCli.main(TajoCli.java:690)
Caused by: org.apache.tajo.cli.InvalidClientSessionException: ERROR: database "default" does not exist
    at org.apache.tajo.client.TajoClient.checkSessionAndGet(TajoClient.java:249)
    at org.apache.tajo.client.TajoClient.access$000(TajoClient.java:75)
    at org.apache.tajo.client.TajoClient$1.call(TajoClient.java:265)
    at org.apache.tajo.client.TajoClient$1.call(TajoClient.java:262)
    at org.apache.tajo.rpc.ServerCallable.withRetries(ServerCallable.java:95)
    ... 3 more

-------------------------------------------------------------------------------------------------------------------------------------------------

On Fri, Nov 28, 2014 at 10:36 AM, Hyunsik Choi wrote:

> Hi Mai,
>
> Good to see you again!
>
> So far, we haven't performed a QphH-based benchmark.
>
> Some of the original TPC-H benchmark queries may not work in Tajo because
> scalar subqueries, IN subqueries, and EXISTS subqueries are not supported
> yet. So, you need to rewrite them slightly into multiple queries.
>
> I haven't run into the situation you described. Could you share the error
> log you got while executing multiple queries?
>
> Best regards,
> Hyunsik
>
>
> On Tue, Nov 25, 2014 at 12:12 PM, Thanh Mai wrote:
> > Hi everyone,
> >
> > I would like to find performance results for Tajo with the TPC-H
> > benchmark in both the power test and the throughput test (and thus a
> > sample QphH value for reference). However, in the slides on the Tajo
> > website, I found only the response times of some queries.
> >
> > Have any of you tested Tajo with the TPC-H benchmark's throughput test
> > or in a multi-user situation? If yes, could you share your experience?
> >
> > When I test Tajo with the TPC-H benchmark's throughput test, some
> > queries fail occasionally. I use the Hive catalog instead of the Tajo
> > catalog. It seems that Tajo has a problem fetching information from the
> > Hive catalog concurrently when multiple queries run in parallel. Have
> > any of you experienced a similar situation? Could you share your
> > experience and insights?
> >
> > Thank you very much!
> >
> > Sincerely,
> >
> > Mai Hai Thanh
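As an illustration of the rewrite Hyunsik describes above, a TPC-H query whose WHERE clause uses a correlated scalar subquery (Q17, for example) can be split into two statements along the following lines. This is only a sketch: the intermediate table name q17_part_avg is invented for the example, and the brand/container values are Q17's sample substitution parameters.

  -- Step 1: materialize the per-part threshold that Q17's correlated
  -- scalar subquery would compute (0.2 * the average quantity per part).
  CREATE TABLE q17_part_avg AS
  SELECT l_partkey AS pa_partkey, 0.2 * AVG(l_quantity) AS pa_threshold
  FROM lineitem
  GROUP BY l_partkey;

  -- Step 2: run the outer query as a plain join against that table,
  -- filtering on the precomputed threshold instead of the subquery.
  SELECT SUM(l_extendedprice) / 7.0 AS avg_yearly
  FROM lineitem
  JOIN part ON p_partkey = l_partkey
  JOIN q17_part_avg ON pa_partkey = l_partkey
  WHERE p_brand = 'Brand#23'
    AND p_container = 'MED BOX'
    AND l_quantity < pa_threshold;

The same pattern (materialize the subquery result first, then join against it) should also cover the IN and EXISTS subqueries Hyunsik mentions.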