Subject: Re: question about partition table in hive
From: Jagat Singh <jagatsingh@gmail.com>
To: user@hive.apache.org
Date: Sat, 14 Sep 2013 10:03:56 +1000

Adding to Sanjay's reply:

The only thing left after Flume has added the partition directories is to tell the Hive metastore about them, which you can do via the ADD PARTITION command. Then you can read the data via Hive straight away.

On Sat, Sep 14, 2013 at 10:00 AM, Sanjay Subramanian <Sanjay.Subramanian@wizecommerce.com> wrote:

> A couple of days back, Erik Sammer at the Hadoop Hands-On Lab at the
> Cloudera Sessions demonstrated how to achieve dynamic partitioning using
> Flume, creating partitioned directories on HDFS that are then readily
> usable by Hive.
>
> Understanding what I can from the two lines of your mail below, I would
> configure Flume to do dynamic partitioning (YEAR, MONTH, DAY, HOUR),
> create those directories in HDFS, and then create Hive tables with those
> partitions and run the queries.
>
> As Stephen said earlier, experiment like crazy - and share, please - it
> will make all of us better as well!
>
> Thanks
>
> sanjay
>
> From: ch huang
> Reply-To: "user@hive.apache.org"
> Date: Thursday, September 12, 2013 6:55 PM
> To: "user@hive.apache.org"
> Subject: question about partition table in hive
>
> hi, all:
> I use Flume to collect log data and put it in HDFS. I want to use Hive
> to do some calculation, querying based on a time range, so I want to use
> a partitioned table. But the data in HDFS is one big file; how can I put
> it into a partitioned table in Hive?
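The ADD PARTITION step mentioned above can be sketched in HiveQL. All table, column, and path names below are illustrative assumptions, not details from the thread:

```sql
-- Hypothetical external table laid over the directory Flume writes to;
-- the schema and location are examples only.
CREATE EXTERNAL TABLE logs (
  ts  STRING,
  msg STRING
)
PARTITIONED BY (year STRING, month STRING, day STRING, hour STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/flume/logs';

-- Register one directory that Flume has already written:
ALTER TABLE logs ADD IF NOT EXISTS
  PARTITION (year='2013', month='09', day='13', hour='18')
  LOCATION '/flume/logs/year=2013/month=09/day=13/hour=18';
```

If the directories follow the `key=value` naming convention shown here, `MSCK REPAIR TABLE logs;` can discover all unregistered partitions in one pass instead of adding them one by one.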
>
> CONFIDENTIALITY NOTICE
> ======================
> This email message and any attachments are for the exclusive use of the
> intended recipient(s) and may contain confidential and privileged
> information. Any unauthorized review, use, disclosure or distribution is
> prohibited. If you are not the intended recipient, please contact the
> sender by reply email and destroy all copies of the original message along
> with any attachments, from your computer system. If you are the intended
> recipient, please be advised that the content of this message is subject to
> access, review and disclosure by the sender's Email System Administrator.
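Sanjay's suggestion relies on Flume's HDFS sink expanding timestamp escape sequences (such as %Y, %m, %d, %H) in its configured output path, so each event lands in a year/month/day/hour directory. A minimal Python sketch of the resulting layout; the base path and pattern are assumptions for illustration, not values from the thread:

```python
from datetime import datetime, timezone

def partition_path(base: str, ts: datetime) -> str:
    """Expand a Flume-HDFS-sink-style path template such as
    hdfs.path = /flume/logs/year=%Y/month=%m/day=%d/hour=%H
    for a single event timestamp."""
    return ts.strftime(f"{base}/year=%Y/month=%m/day=%d/hour=%H")

# One event written at 2013-09-14 10:03 UTC ends up under:
event_time = datetime(2013, 9, 14, 10, 3, tzinfo=timezone.utc)
print(partition_path("/flume/logs", event_time))
# → /flume/logs/year=2013/month=09/day=14/hour=10
```

Because the directory names use the `key=value` form, Hive can map them directly onto partition columns once the partitions are registered in the metastore.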