From: Mich Talebzadeh
Date: Wed, 27 Jul 2016 20:38:57 +0100
Reply-To: user@hive.apache.org (Apache Hive user mailing list)
In-Reply-To: <524F0A39-70C8-4FEC-969C-49F37F969199@askme.in>
Subject: Re: Hive on spark

You mean you want to run Hive using Spark as the execution engine, which uses YARN by default? Something like the below:

hive> select max(id) from oraclehadoop.dummy_parquet;
Starting Spark Job = 8218859d-1d7c-419c-adc7-4de175c3ca6d

Query Hive on Spark job[1] stages:
2
3

Status: Running (Hive on Spark job[1])
Job Progress Format
CurrentTime StageId_StageAttemptId: SucceededTasksCount(+RunningTasksCount-FailedTasksCount)/TotalTasksCount [StageCost]
2016-07-27 20:38:17,269 Stage-2_0: 0(+8)/24     Stage-3_0: 0/1
2016-07-27 20:38:20,298 Stage-2_0: 8(+4)/24     Stage-3_0: 0/1
2016-07-27 20:38:22,309 Stage-2_0: 11(+1)/24    Stage-3_0: 0/1
2016-07-27 20:38:23,330 Stage-2_0: 12(+8)/24    Stage-3_0: 0/1
2016-07-27 20:38:26,360 Stage-2_0: 17(+7)/24    Stage-3_0: 0/1
2016-07-27 20:38:27,386 Stage-2_0: 20(+4)/24    Stage-3_0: 0/1
2016-07-27 20:38:28,391 Stage-2_0: 21(+3)/24    Stage-3_0: 0/1
2016-07-27 20:38:29,395 Stage-2_0: 24/24 Finished       Stage-3_0: 1/1 Finished
Status: Finished successfully in 13.14 seconds
OK
100000000
Time taken: 13.426 seconds, Fetched: 1 row(s)

HTH

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.

On 27 July 2016 at 20:31, Mudit Kumar wrote:
> Hi All,
>
> I need to configure a Hive cluster to use the Spark execution engine (on YARN).
> I already have a running Hadoop cluster.
>
> Can someone point me to the relevant documentation?
>
> TIA.
>
> Thanks,
> Mudit
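[Editor's note] To make the answer above concrete, here is a minimal sketch of the session-level configuration for Hive on Spark, following the property names in the "Hive on Spark: Getting Started" guide. The memory and instance values are illustrative placeholders, not recommendations, and the exact setup (including which Spark jars Hive needs on its classpath) depends on your Hive and Spark versions:

```sql
-- Illustrative Hive session settings for running on Spark over YARN.
-- Property names are from the Hive on Spark docs; values are placeholders.
set hive.execution.engine=spark;    -- switch the engine from mr/tez to Spark
set spark.master=yarn;              -- launch Spark executors under YARN
set spark.executor.memory=4g;       -- placeholder sizing
set spark.executor.instances=8;     -- placeholder sizing
set spark.eventLog.enabled=true;    -- optional: logs for the Spark history server

-- Then run queries as usual; they show the Hive-on-Spark progress output above:
select max(id) from oraclehadoop.dummy_parquet;
```

These can also be set cluster-wide in hive-site.xml instead of per session.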