Date: Mon, 14 May 2012 13:13:22 +0530
Subject: Re: Is my Use Case possible with Hive?
From: Nitin Pawar
To: common-dev@hadoop.apache.org
Cc: user@hive.apache.org, dev@hive.apache.org, hadoop-dev@lucene.apache.org, dev@sqoop.apache.org

How many records? What is your Hadoop cluster setup? How many nodes?
If you are running Hadoop as a single-node setup on an ordinary desktop, I doubt it will be of much help. You need a stronger cluster for better query runtimes, and of course query optimization, which I guess you have already taken care of.

On Mon, May 14, 2012 at 12:39 PM, Bhavesh Shah wrote:
> Hello all,
> My use case is:
> 1) I have a relational database (MS SQL Server) holding a very large amount of data.
> 2) I want to analyze this data and generate reports from that analysis.
> Like this, I have to generate various reports, each based on a different analysis.
>
> I tried to implement this using Hive. What I did is:
> 1) I imported all tables into Hive from MS SQL Server using Sqoop.
> 2) I wrote many Hive queries, which execute over JDBC against the Hive Thrift Server.
> 3) I am getting the correct results, in table form, as expected.
> 4) The problem is that execution takes far too long.
>    (My complete program runs in about 3-4 hours on a *small amount of data*.)
>
> I decided to do this using Hive, and as I said, that is how long Hive takes to execute.
> My organization expects this task to complete in under half an hour.
>
> After spending so much time getting this task to run, what should I do?
> I want to ask one thing:
> *Is this use case possible with Hive?* If so, what should I do in my program to improve performance?
> *If not, what would be a better way to implement this use case?*
>
> Please reply.
> Thanks
>
> --
> Regards,
> Bhavesh Shah

--
Nitin Pawar
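The Sqoop import in step 1 above would look roughly like the sketch below. The host, database, credentials, table, and split column are all placeholders, and this assumes the MS SQL Server JDBC driver is already on Sqoop's classpath:

```shell
# Hypothetical import of one table from MS SQL Server into Hive.
# Server, database, user, table, and split column are placeholders.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=sales" \
  --username etl_user -P \
  --table orders \
  --split-by order_id \
  --hive-import \
  --hive-table orders \
  -m 4
```

The `-m 4` runs four parallel map tasks for the import; on a larger cluster this can be raised, provided `--split-by` names a column with reasonably uniform distribution.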
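The "query optimization" mentioned above usually starts with a few session-level settings. The properties below existed in the Hive releases of this period, but whether each one helps depends entirely on the queries, so treat this as a sketch rather than a recipe; the script name is a placeholder for the real report script:

```shell
# Hypothetical run of a report script with common tuning settings:
# run independent query stages in parallel, convert joins against
# small tables to map-side joins, and compress intermediate data.
hive --hiveconf hive.exec.parallel=true \
     --hiveconf hive.auto.convert.join=true \
     --hiveconf hive.exec.compress.intermediate=true \
     -f report_queries.hql
```

Beyond session settings, partitioning the Hive tables on the columns the reports filter by typically cuts runtime far more than any flag, since it lets Hive skip whole directories of data.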