Date: Fri, 3 Feb 2012 10:38:29 +0530
Subject: Re: Table not creating in hive
From: Bhavesh Shah
To: dev@hive.apache.org, sqoop-user@incubator.apache.org

Hello Bejoy & Alexis,

Thanks for your reply. I am using MySQL as the database (not Derby). Previously I was using --split-by and it was working fine, but after I installed MySQL and changed the database I got an error for the --split-by option, which is why I now use -m 1. But with that, the import reports that 0 records were retrieved. Here are the logs:

hadoop@ubuntu:~/sqoop-1.3.0-cdh3u1/bin$ ./sqoop-import --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=FIGMDHadoopTest' --table Appointment --hive-table appointment -m 1 --hive-import --verbose

12/01/31 22:33:40 DEBUG tool.BaseSqoopTool: Enabled debug logging.
12/01/31 22:33:40 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/01/31 22:33:40 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Added factory com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by /home/hadoop/sqoop-1.3.0-cdh3u1/conf/managers.d/mssqoop-sqlserver
12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Loaded manager factory: com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
12/01/31 22:33:40 INFO SqlServer.MSSQLServerManagerFactory: Using Microsoft's SQL Server - Hadoop Connector
12/01/31 22:33:40 INFO manager.SqlManager: Using default fetchSize of 1000
12/01/31 22:33:40 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.microsoft.sqoop.SqlServer.MSSQLServerManager@116471f
12/01/31 22:33:40 INFO tool.CodeGenTool: Beginning code generation
12/01/31 22:33:40 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
12/01/31 22:33:41 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
12/01/31 22:33:41 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Appointment]
12/01/31 22:33:41 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
12/01/31 22:33:41 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Appointment]
12/01/31 22:33:41 DEBUG orm.ClassWriter: selected columns:
12/01/31 22:33:41 DEBUG orm.ClassWriter:   AppointmentUid
12/01/31 22:33:41 DEBUG orm.ClassWriter:   ExternalID
12/01/31 22:33:41 DEBUG orm.ClassWriter:   PatientUid
12/01/31 22:33:41 DEBUG orm.ClassWriter:   StartTime
12/01/31 22:33:41 DEBUG orm.ClassWriter:   EndTime
12/01/31 22:33:41 DEBUG orm.ClassWriter:   ResourceUid
12/01/31 22:33:41 DEBUG orm.ClassWriter:   Note
12/01/31 22:33:41 DEBUG orm.ClassWriter:   AppointmentTypeUid
12/01/31 22:33:41 DEBUG orm.ClassWriter:   AppointmentStatusUid
12/01/31 22:33:41 DEBUG orm.ClassWriter:   CheckOutNote
12/01/31 22:33:41 DEBUG orm.ClassWriter:   CreatedDate
12/01/31 22:33:41 DEBUG orm.ClassWriter:   CreatedByUid
12/01/31 22:33:41 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.java
12/01/31 22:33:41 DEBUG orm.ClassWriter: Table name: Appointment
12/01/31 22:33:41 DEBUG orm.ClassWriter: Columns: AppointmentUid:1, ExternalID:12, PatientUid:1, StartTime:93, EndTime:93, ResourceUid:1, RenderringProviderUid:1, ReferringProviderUid:1, ServiceLocationUid:1, Note:-1, AppointmentTypeUid:1, AppointmentStatusUid:1, CheckOutNote:12, ROSxml:-1, CreatedDate:93, CreatedByUid:1, ModifiedDate:93, ModifiedByUid:1, SingleDayAppointmentGroupUid:1, MultiDayAppointmentGroupUid:1,
12/01/31 22:33:41 DEBUG orm.ClassWriter: sourceFilename is Appointment.java
12/01/31 22:33:41 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/
12/01/31 22:33:41 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-0.20.2-cdh3u2
12/01/31 22:33:41 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.java
12/01/31 22:33:41 DEBUG orm.CompilationManager: Invoking javac with args:
12/01/31 22:33:41 DEBUG orm.CompilationManager:   -sourcepath
12/01/31 22:33:41 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/
12/01/31 22:33:41 DEBUG orm.CompilationManager:   -d
12/01/31 22:33:41 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/
12/01/31 22:33:41 DEBUG orm.CompilationManager:   -classpath
12/01/31 22:33:41 DEBUG orm.CompilationManager:   /home/hadoop/hadoop-0.20.2-cdh3u2//conf:/usr/lib/jvm/java-6-sun-1.6.0.26//lib/tools.jar:/home/hadoop/hadoop-0.20.2-cdh3u2/:/home/hadoop/hadoop-0.20.2-cdh3u2//hadoop-core-0.20.2-cdh3u2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/ant-contrib-1.0b3.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-cli-1.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-codec-1.4.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-el-1.0.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-httpclient-3.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-logging-1.0.4.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/core-3.1.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/hadoop-fairscheduler-0.20.2-cdh3u2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jackson-core-asl-1.5.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jackson-mapper-asl-1.5.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jetty-6.1.26.cloudera.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jetty-util-6.1.26.cloudera.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jsch-0.1.42.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/junit-4.5.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/kfs-0.2.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/libfb303.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/libthrift.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/log4j-1.2.15.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/mockito-all-1.8.2.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/oro-2.0.8.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/servlet-api-2.5-6.1.14.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/xmlenc-0.52.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-0.20.2-cdh3u2//lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1/conf/::/home/hadoop/sqoop-1.3.0-cdh3u1//lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/avro-1.5.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/avro-ipc-1.5.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/avro-mapred-1.5.1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/commons-io-1.4.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/jopt-simple-3.2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/paranamer-2.3.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/snappy-java-1.0.3-rc2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/sqljdbc4.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//lib/sqoop-sqlserver-1.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/hadoop/hbase-0.90.1-cdh3u0/:/home/hadoop/hbase-0.90.1-cdh3u0//hbase-0.90.1-cdh3u0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//hbase-0.90.1-cdh3u0-tests.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/activation-1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/asm-3.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/avro-1.3.3.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-cli-1.2.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-codec-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-el-1.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-httpclient-3.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-lang-2.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-logging-1.1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/commons-net-1.4.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/core-3.1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/guava-r06.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/hadoop-core-0.20.2-cdh3u0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/hbase-0.90.1-cdh3u0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-core-asl-1.5.2.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-jaxrs-1.5.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-mapper-asl-1.5.2.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jackson-xc-1.5.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jasper-compiler-5.5.23.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jasper-runtime-5.5.23.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jaxb-api-2.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jaxb-impl-2.1.12.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jersey-core-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jersey-json-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jersey-server-1.4.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jettison-1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jetty-6.1.26.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jetty-util-6.1.26.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jruby-complete-1.0.3.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsp-2.1-6.1.14.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsp-api-2.1-6.1.14.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsp-api-2.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/jsr311-api-1.1.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/log4j-1.2.16.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/protobuf-java-2.3.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/servlet-api-2.5-6.1.14.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/servlet-api-2.5.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/slf4j-api-1.5.8.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/slf4j-log4j12-1.5.8.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/stax-api-1.0.1.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/thrift-0.2.0.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/xmlenc-0.52.jar:/home/hadoop/hbase-0.90.1-cdh3u0//lib/zookeeper-3.3.3-cdh3u0.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//sqoop-1.3.0-cdh3u1.jar:/home/hadoop/sqoop-1.3.0-cdh3u1//sqoop-test-1.3.0-cdh3u1.jar::/home/hadoop/hadoop-0.20.2-cdh3u2/hadoop-core-0.20.2-cdh3u2.jar:/home/hadoop/sqoop-1.3.0-cdh3u1/sqoop-1.3.0-cdh3u1.jar
12/01/31 22:33:42 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.java to /home/hadoop/sqoop-1.3.0-cdh3u1/bin/./Appointment.java
java.io.IOException: Destination '/home/hadoop/sqoop-1.3.0-cdh3u1/bin/./Appointment.java' already exists
        at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
        at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
        at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
        at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:337)
        at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:423)
        at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
12/01/31 22:33:42 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.jar
12/01/31 22:33:42 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff
12/01/31 22:33:42 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.class -> Appointment.class
12/01/31 22:33:42 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/a7d94d7420001a743a4746242116beff/Appointment.jar
12/01/31 22:33:42 INFO mapreduce.ImportJobBase: Beginning import of Appointment
12/01/31 22:33:42 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
12/01/31 22:33:42 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Appointment]
12/01/31 22:33:42 DEBUG mapreduce.DataDrivenImportJob: Using table class: Appointment
12/01/31 22:33:42 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.microsoft.sqoop.SqlServer.MSSQLServerDBInputFormat
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/sqoop-1.3.0-cdh3u1.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqljdbc4.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqoop-sqlserver-1.0.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/sqoop-1.3.0-cdh3u1.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/jopt-simple-3.2.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/jackson-mapper-asl-1.7.3.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/snappy-java-1.0.3-rc2.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/ant-eclipse-1.0-jvm1.2.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqoop-sqlserver-1.0.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/avro-mapred-1.5.1.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/avro-1.5.1.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/avro-ipc-1.5.1.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/paranamer-2.3.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/sqljdbc4.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/jackson-core-asl-1.7.3.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/commons-io-1.4.jar
12/01/31 22:33:42 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop-1.3.0-cdh3u1/lib/ant-contrib-1.0b3.jar
12/01/31 22:33:43 INFO mapred.JobClient: Running job: job_201201311414_0051
12/01/31 22:33:44 INFO mapred.JobClient:  map 0% reduce 0%
12/01/31 22:33:48 INFO mapred.JobClient:  map 100% reduce 0%
12/01/31 22:33:48 INFO mapred.JobClient: Job complete: job_201201311414_0051
12/01/31 22:33:48 INFO mapred.JobClient: Counters: 11
12/01/31 22:33:48 INFO mapred.JobClient:   Job Counters
12/01/31 22:33:48 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=4152
12/01/31 22:33:48 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/01/31 22:33:48 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/01/31 22:33:48 INFO mapred.JobClient:     Launched map tasks=1
12/01/31 22:33:48 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/01/31 22:33:48 INFO mapred.JobClient:   FileSystemCounters
12/01/31 22:33:48 INFO mapred.JobClient:     HDFS_BYTES_READ=87
12/01/31 22:33:48 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=61985
12/01/31 22:33:48 INFO mapred.JobClient:   Map-Reduce Framework
12/01/31 22:33:48 INFO mapred.JobClient:     Map input records=0
12/01/31 22:33:48 INFO mapred.JobClient:     Spilled Records=0
12/01/31 22:33:48 INFO mapred.JobClient:     Map output records=0
12/01/31 22:33:48 INFO mapred.JobClient:     SPLIT_RAW_BYTES=87
12/01/31 22:33:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 6.2606 seconds (0 bytes/sec)
12/01/31 22:33:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
12/01/31 22:33:48 INFO hive.HiveImport: Removing temporary files from import process: Appointment/_logs
12/01/31 22:33:48 INFO hive.HiveImport: Loading uploaded data into Hive
12/01/31 22:33:48 DEBUG hive.HiveImport: Hive.inputTable: Appointment
12/01/31 22:33:48 DEBUG hive.HiveImport: Hive.outputTable: appointment
12/01/31 22:33:48 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
12/01/31 22:33:48 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
12/01/31 22:33:48 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Appointment]
12/01/31 22:33:48 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
12/01/31 22:33:48 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [Appointment]
12/01/31 22:33:48 WARN hive.TableDefWriter: Column StartTime had to be cast to a less precise type in Hive
12/01/31 22:33:48 WARN hive.TableDefWriter: Column EndTime had to be cast to a less precise type in Hive
12/01/31 22:33:48 WARN hive.TableDefWriter: Column CreatedDate had to be cast to a less precise type in Hive
12/01/31 22:33:48 WARN hive.TableDefWriter: Column ModifiedDate had to be cast to a less precise type in Hive
12/01/31 22:33:48 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE IF NOT EXISTS `appointment` ( `AppointmentUid` STRING, `ExternalID` STRING, `PatientUid` STRING, `StartTime` STRING, `EndTime` STRING, `Note` STRING, `AppointmentTypeUid` STRING, `AppointmentStatusUid` STRING, `CheckOutNote` STRING, `CreatedDate` STRING, `CreatedByUid` STRING) COMMENT 'Imported by sqoop on 2012/01/31 22:33:48' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
12/01/31 22:33:48 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://localhost:54310/user/hadoop/Appointment' INTO TABLE `appointment`
12/01/31 22:33:48 DEBUG hive.HiveImport: Using external Hive process.
12/01/31 22:33:50 INFO hive.HiveImport: Hive history file=/tmp/hadoop/hive_job_log_hadoop_201201312233_1008229902.txt
12/01/31 22:33:52 INFO hive.HiveImport: OK
12/01/31 22:33:52 INFO hive.HiveImport: Time taken: 2.006 seconds
12/01/31 22:33:53 INFO hive.HiveImport: Loading data to table default.appointment
12/01/31 22:33:53 INFO hive.HiveImport: OK
12/01/31 22:33:53 INFO hive.HiveImport: Time taken: 0.665 seconds
12/01/31 22:33:53 INFO hive.HiveImport: Hive import complete.
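[Editor's note: the log above shows two separate problems: the MapReduce import read 0 records ("Map input records=0"), and code generation could not move Appointment.java because a stale copy already exists in bin/. A minimal, hedged checklist for both, reusing the connect string and table name from the log (the /tmp output directories are illustrative; `sqoop eval` and `--outdir`/`--bindir` are standard Sqoop 1.x options):]

```shell
# 1) Confirm the source table has rows at all, through the same JDBC
#    connection the import uses (rules out an empty table or wrong database).
./sqoop eval \
  --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=FIGMDHadoopTest' \
  --query 'SELECT COUNT(*) FROM Appointment'

# 2) Remove the stale generated source so CompilationManager can rename the
#    new one, or keep generated artifacts out of bin/ with --outdir/--bindir.
rm -f /home/hadoop/sqoop-1.3.0-cdh3u1/bin/Appointment.java
./sqoop-import \
  --connect 'jdbc:sqlserver://192.168.1.1;username=abcd;password=12345;database=FIGMDHadoopTest' \
  --table Appointment --hive-table appointment -m 1 --hive-import \
  --outdir /tmp/sqoop-gen-src --bindir /tmp/sqoop-gen-bin

# 3) Inspect what actually landed on HDFS before looking at Hive:
#    zero-byte part files here mean the problem is on the import side,
#    not the Hive side.
hadoop fs -ls /user/hive/warehouse/appointment
```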
--
Thanks and Regards,
Bhavesh Shah


On Thu, Feb 2, 2012 at 8:20 PM, Alexis De La Cruz Toledo <alexisdct@gmail.com> wrote:

> This is because you need the metastore.
> If you haven't installed it on a database,
> it is set up with Derby in the directory from which
> you accessed Hive; remember where that was.
> There you should find the directory named metastore_db,
> and you should access Hive from that directory.
>
> Regards.
>
> El 2 de febrero de 2012 05:46, Bhavesh Shah escribió:
>
> > Hello all,
> >
> > After successfully importing the tables in Hive I am not able to see the
> > table in Hive.
> > When I imported the table I saw the dir on HDFS (under
> > /user/hive/warehouse/) but when I execute the command "SHOW TABLES" in
> > Hive, the table is not in the list.
> >
> > I searched a lot about it but am not getting anywhere.
> > Please suggest me some solution for it.
> >
> > --
> > Thanks and Regards,
> > Bhavesh Shah
>
> --
> Ing. Alexis de la Cruz Toledo.
> *Av. Instituto Politécnico Nacional No. 2508, Col. San Pedro Zacatenco,
> México, D.F., 07360*
> *CINVESTAV, DF.*
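[Editor's note: Alexis's point about the metastore explains the "SHOW TABLES" symptom. With the default embedded Derby setup, Hive creates a metastore_db directory in whatever working directory the Hive process was started from, so the Hive launched by Sqoop and an interactively launched Hive shell can end up looking at two different metastores. A sketch of pinning all clients to one shared MySQL-backed metastore in hive-site.xml (the database name "metastore" and the credentials are illustrative assumptions; the MySQL JDBC driver jar must be on Hive's classpath):]

```xml
<!-- hive-site.xml: point every Hive client at one shared metastore,
     instead of a per-directory embedded Derby metastore_db -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>
```

[With this in place, `hive -e 'SHOW TABLES'` should list `appointment` no matter which directory Sqoop or the shell was started from.]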