From: yogesh.kumar13@wipro.com
To: user@hive.apache.org
Subject: RE: Hive uploading
Date: Thu, 5 Jul 2012 11:33:33 +0000
Hi Bejoy

I have confirmed the Hive installation; it is the same for both.
I ran echo $HIVE_HOME on both the Sqoop terminal and the Hive terminal,
and both print the same path:
HADOOP/hive
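In case it helps, this is a fuller check than echo alone (the which hive line is my addition; a second copy earlier on the PATH would not show up in $HIVE_HOME):

    # run in both the Sqoop terminal and the Hive terminal
    echo $HIVE_HOME    # prints HADOOP/hive on both for me
    which hive         # should resolve to the same installation's bin/hive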

I am new to Hive and Sqoop; could you please give an example of using the --verbose option with this command?


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive --create-hive-table --hive-import --hive-home HADOOP/hive
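My guess at the placement, since the Sqoop docs list --verbose among the common arguments (please correct me if it must go elsewhere):

    sqoop import --verbose \
      --connect jdbc:mysql://localhost:3306/Demo \
      --username sqoop1 --password SQOOP1 \
      -table Dummy \
      --hive-table dummyhive --create-hive-table --hive-import \
      --hive-home HADOOP/hive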



Please help



From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 3:14 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

No issues seen on the first look. Can you run the sqoop import with the --verbose option and post the console dump?

Do you have multiple Hive installations? If so, please verify that you are using the same Hive both for the Sqoop import and for verifying the data with the Hive CLI (the installation at HADOOP/hive). A sketch of what I mean is below.
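A rough sketch (note: if you are on Hive's default embedded Derby metastore, the metastore_db directory is created under whatever directory you launch from, so two different launch directories can behave like two different Hives even with a single installation):

    # run from the directory where you invoke sqoop,
    # then again from the directory where you start the hive cli
    echo $HIVE_HOME
    which hive
    ls -d ./metastore_db    # embedded Derby metastore, if one was created here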

Regards
Bejoy KS


From: "yogesh.kumar13@= wipro.com" <yogesh.kumar13@wipro.com>
To: user@hive.apache.org Sent: Thursday, July 5, 201= 2 2:58 PM
Subject: Hive uploading

Hi

I have created a table in MySQL named Dummy; it has 2 columns and 1 row of data.

I want to upload that table into Hive using the Sqoop tool.
I used this command:


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive --create-hive-table --hive-import --hive-home HADOOP/hive


The table has been successfully uploaded into HDFS at /user/hive/warehouse,
but when I run this command in Hive:

Show Tables;

I don't find dummyhive table in it.
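If it helps the diagnosis, these are checks I can run (the per-table warehouse path is my assumption, from the default /user/hive/warehouse plus the table name):

    hadoop fs -ls /user/hive/warehouse/dummyhive   # did the import create the data directory?
    hive -e 'show tables;'                         # does a fresh cli session see the table?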

Please suggest and Help


Details of the command and output:

mediaadmins-iMac-2:hive mediaadmin$ sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive --create-hive-table --hive-import --hive-home HADOOP/hive
12/07/05 11:09:15 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 11:09:15 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 11:09:15 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 11:09:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 11:09:15 INFO tool.CodeGenTool: Beginning code generation
12/07/05 11:09:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `Dummy` AS t LIMIT 1
12/07/05 11:09:16 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 11:09:16 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/382d1c58323cea76efd197632bebbfcd/Dummy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 11:09:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/382d1c58323cea76efd197632bebbfcd/Dummy.jar
12/07/05 11:09:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 11:09:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 11:09:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 11:09:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 11:09:17 INFO mapreduce.ImportJobBase: Beginning import of Dummy
12/07/05 11:09:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`Sno`), MAX(`Sno`) FROM `Dummy`
12/07/05 11:09:18 INFO mapred.JobClient: Running job: job_201207051104_0001
12/07/05 11:09:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 11:09:33 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 11:09:35 INFO mapred.JobClient: Job complete: job_201207051104_0001
12/07/05 11:09:35 INFO mapred.JobClient: Counters: 5
12/07/05 11:09:35 INFO mapred.JobClient:   Job Counters
12/07/05 11:09:35 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 11:09:35 INFO mapred.JobClient:   FileSystemCounters
12/07/05 11:09:35 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 11:09:35 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 11:09:35 INFO mapred.JobClient:     Map input records=1
12/07/05 11:09:35 INFO mapred.JobClient:     Spilled Records=0
12/07/05 11:09:35 INFO mapred.JobClient:     Map output records=1
12/07/05 11:09:35 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 17.945 seconds (0.4458 bytes/sec)
12/07/05 11:09:35 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 11:09:35 INFO hive.HiveImport: Removing temporary files from import process: Dummy/_logs
12/07/05 11:09:35 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 11:09:35 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `Dummy` AS t LIMIT 1
12/07/05 11:09:37 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 11:09:37 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051109_1901926452.txt
12/07/05 11:09:41 INFO hive.HiveImport: OK
12/07/05 11:09:41 INFO hive.HiveImport: Time taken: 3.934 seconds
12/07/05 11:09:41 INFO hive.HiveImport: Loading data to table default.dummyhive
12/07/05 11:09:41 INFO hive.HiveImport: OK
12/07/05 11:09:41 INFO hive.HiveImport: Time taken: 0.262 seconds
12/07/05 11:09:41 INFO hive.HiveImport: Hive import complete.



Why is this happening? Please help me out.

Thanks & Regards
Yogesh Kumar
