Subject: Re: Sqoop Hcat Int partition error
From: Biswajit Nayak <biswajit@altiscale.com>
To: user@hive.apache.org
Date: Fri, 4 Mar 2016 17:40:44 +0530

Has anyone seen this?

On Tue, Mar 1, 2016 at 11:07 AM, Biswajit Nayak <biswajit@altiscale.com> wrote:
The fix in https://issues.apache.org/jira/browse/HIVE-7164 does not work.

On Tue, Mar 1, 2016 at 10:51 AM, Richa Sharma <mailtorichasharma@gmail.com> wrote:

Great!

So what is the interim fix you are implementing?

Richa

On Mar 1, 2016 4:06 PM, "Biswajit Nayak" <biswajit@altiscale.com> wrote:
Thanks Richa.

The issue was supposed to be fixed in the Hive 0.12 release, as per the JIRA https://issues.apache.org/jira/browse/HIVE-7164.

I have even raised a ticket in the Sqoop JIRA [SQOOP-2840] for this.

Thanks
Biswa





On Tue, Mar 1, 2016 at 9:56 AM, Richa Sharma <mailtorichasharma@gmail.com> wrote:
Hi,

The values should still persist if the partition column data type in Hive is a string.

I am checking the HCatalog documentation for support of the int data type in partition columns.

Cheers
Richa

On Tue, Mar 1, 2016 at 3:06 PM, Biswajit Nayak <biswajit@altiscale.com> wrote:
Hi Richa,

That's a workaround, but how do I handle the columns with INT type? Changing the type would be the last option for me.

Regards
Biswa



On Tue, Mar 1, 2016 at 9:31 AM, Richa Sharma <mailtorichasharma@gmail.com> wrote:
Hi Biswajit

The answer is in the last line of the error message. Change the data type of the partition column to string in Hive and try again.

Hope it helps!

Richa
16/02/12 08:04:12 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: The table provided default.emp_details1 uses unsupported partitioning key type for column salary : int. Only string fields are allowed in partition columns in Catalog
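A minimal sketch of that suggestion, assuming the data is reloaded into a parallel table whose partition key is a string (the staging table name emp_details1_str is made up for illustration and this is untested):

-- Parallel table with the partition key typed as string
CREATE TABLE emp_details1_str (
  id   INT,
  name STRING,
  deg  STRING,
  dept STRING)
PARTITIONED BY (salary STRING);

-- Reload with dynamic partitioning, casting the old int key to string
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE emp_details1_str PARTITION (salary)
SELECT id, name, deg, dept, CAST(salary AS STRING) FROM emp_details1;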

On Tue, Mar 1, 2016 at 2:19 PM, Biswajit Nayak <biswajit@altiscale.com> wrote:
Hi All,

I am trying to do a Sqoop export from Hive (integer-type partition) to MySQL through HCatalog, and it fails with the following error.

Versions:-

Hadoop :- 2.7.1
Hive   :- 1.2.0
Sqoop  :- 1.4.5

Table in Hive :-


hive> use default;
OK
Time taken: 0.028 seconds
hive> describe emp_details1;
OK
id                      int
name                    string
deg                     string
dept                    string
salary                  int

# Partition Information
# col_name              data_type               comment

salary                  int
Time taken: 0.125 seconds, Fetched: 10 row(s)
hive>

hive> select * from emp_details1;
OK
1201    gopal           50000
1202    manisha         50000
1203    kalil           50000
1204    prasanth        50000
1205    kranthi         50000
1206    satish          50000
Time taken: 0.195 seconds, Fetched: 6 row(s)
hive>

Conf added to the Hive metastore hive-site.xml:


[alti-test-01@hdpnightly271-ci-91-services ~]$ grep -A5 -B2 -i "hive.metastore.integral.jdo.pushdown" /etc/hive-metastore/hive-site.xml
    </property>
    <property>
        <name>hive.metastore.integral.jdo.pushdown</name>
        <value>TRUE</value>
    </property>

</configuration>
[alti-test-01@hdpnightly271-ci-91-services ~]$
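For what it is worth, the value a Hive client session resolves for that property can be checked with set; this only reflects the client side, and the metastore service (which reads its own hive-site.xml) is assumed to have been restarted after the edit:

hive> set hive.metastore.integral.jdo.pushdown;
hive.metastore.integral.jdo.pushdown=TRUE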

The issue remains the same:


[alti-test-01@hdpnightly271-ci-91-services ~]$ /opt/sqoop-1.4.5/bin/sqoop export --connect jdbc:mysql://localhost:3306/test --username hive --password ********* --table employee --hcatalog-database default --hcatalog-table emp_details1
Warning: /opt/sqoop-1.4.5/bin/../../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /opt/sqoop-1.4.5/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /opt/sqoop-1.4.5/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/02/12 08:04:00 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
16/02/12 08:04:00 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/02/12 08:04:00 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/02/12 08:04:00 INFO tool.CodeGenTool: Beginning code generation
16/02/12 08:04:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
16/02/12 08:04:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
16/02/12 08:04:01 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop
Note: /tmp/sqoop-alti-test-01/compile/1b0d4b1c30f167eb57ef488232ab49c8/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/02/12 08:04:07 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-alti-test-01/compile/1b0d4b1c30f167eb57ef488232ab49c8/employee.jar
16/02/12 08:04:07 INFO mapreduce.ExportJobBase: Beginning export of employee
16/02/12 08:04:08 INFO mapreduce.ExportJobBase: Configuring HCatalog for export job
16/02/12 08:04:08 INFO hcat.SqoopHCatUtilities: Configuring HCatalog specific details for job
16/02/12 08:04:08 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
16/02/12 08:04:08 INFO hcat.SqoopHCatUtilities: Database column names projected : [id, name, deg, salary, dept]
16/02/12 08:04:08 INFO hcat.SqoopHCatUtilities: Database column name - info map :
    id : [Type : 4,Precision : 11,Scale : 0]
    name : [Type : 12,Precision : 20,Scale : 0]
    deg : [Type : 12,Precision : 20,Scale : 0]
    salary : [Type : 4,Precision : 11,Scale : 0]
    dept : [Type : 12,Precision : 10,Scale : 0]

16/02/12 08:04:10 INFO hive.metastore: Trying to connect to metastore with URI thrift://hive-hdpnightly271-ci-91.test.altiscale.com:9083
16/02/12 08:04:10 INFO hive.metastore: Connected to metastore.
16/02/12 08:04:11 INFO hcat.SqoopHCatUtilities: HCatalog full table schema fields = [id, name, deg, dept, salary]
16/02/12 08:04:12 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: The table provided default.emp_details1 uses unsupported partitioning key type for column salary : int. Only string fields are allowed in partition columns in Catalog

Stuck with this issue. Has anyone conquered this before?

Regards
Biswa
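If the data were staged into a string-partitioned copy as sketched earlier, the same export could then be pointed at that table; apart from --hcatalog-table (and using -P, as the log's own warning suggests), nothing changes from the command in the thread. emp_details1_str is the made-up staging table name, so this is a sketch rather than a verified fix:

/opt/sqoop-1.4.5/bin/sqoop export --connect jdbc:mysql://localhost:3306/test \
  --username hive -P \
  --table employee \
  --hcatalog-database default --hcatalog-table emp_details1_str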