Subject: Re: Error while exporting table data from hive to Oracle through Sqoop
From: abhijeet gaikwad <abygaikwad17@gmail.com>
To: user@hive.apache.org, user@sqoop.apache.org
Date: Tue, 5 Mar 2013 19:29:09 +0530

+ sqoop user

The answer is in your exception! Check your data: you're hitting a unique
key violation. Oracle is rejecting inserted rows whose key already exists
in the target table (ORA-00001 on constraint HDFSUSER.BTTN_BKP_PK).
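A quick way to confirm is to look for key values that occur more than once
in the data you are exporting, and to check whether BTTN_BKP already holds
rows from an earlier run. A minimal sketch from the Hive side, assuming the
column behind BTTN_BKP_PK is BTTN_ID (a guess on my part, substitute the
actual constraint columns):

    # List key values that appear more than once in the source table.
    hive -e "SELECT bttn_id, COUNT(*) AS cnt
             FROM bttn
             GROUP BY bttn_id
             HAVING COUNT(*) > 1
             LIMIT 20;"

On the Oracle side, a plain SELECT COUNT(*) FROM BTTN_BKP before the export
tells you whether the table was really empty to begin with.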
Also keep in mind that a Sqoop export is not atomic: each map task commits
rows to the database in batches as it runs, so a task attempt that fails
midway (like your first attempt, which died on a parse error) may leave
already-committed rows behind. When the framework retries the attempt, it
re-reads its split from the beginning and can then collide with the rows
the first attempt inserted.
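The usual guard against that is to stage the export. A sketch is below,
where BTTN_BKP_STG is a hypothetical staging table you would have to create
beforehand with the same structure as BTTN_BKP:

    # Rows land in the staging table first and are moved to BTTN_BKP in a
    # single transaction only if every map task succeeds.
    sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
      --username HDFSUSER -P \
      --table BTTN_BKP \
      --staging-table BTTN_BKP_STG --clear-staging-table \
      --export-dir /home/hadoop/user/hive/warehouse/bttn \
      -m 1 --input-fields-terminated-by '\001'

If instead you want re-runs to overwrite rows that are already present,
--update-key together with --update-mode allowinsert may be worth a look,
though support for allowinsert depends on the connector.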
Separately, note that the very first task failure in your log is a parse
error, not a database error. The java.util.NoSuchElementException thrown
from BTTN_BKP.__loadFromFields() means a record in the export directory
produced fewer fields than the generated class expects. Typical culprits in
Hive-written data are string columns containing embedded newline or ^A
delimiter characters, and NULLs: Hive writes NULL as the literal \N, which
Sqoop does not treat as NULL unless told to.
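If your data contains NULLs, it is cheap insurance to spell out Hive's NULL
encoding on the export command line. These are standard sqoop-export
options; the doubled backslash keeps the shell and Sqoop's own unescaping
from eating the literal \N:

    # Tell Sqoop that \N in the input files means SQL NULL, for both
    # string and non-string columns.
    sqoop export --connect jdbc:oracle:thin:@10.99.42.11:1521/clouddb \
      --username HDFSUSER -P \
      --table BTTN_BKP \
      --export-dir /home/hadoop/user/hive/warehouse/bttn \
      -m 1 --input-fields-terminated-by '\001' \
      --input-null-string '\\N' --input-null-non-string '\\N'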
Thanks,
Abhijeet

On Tue, Mar 5, 2013 at 7:24 PM, Ajit Kumar Shreevastava <Ajit.Shreevastava@hcl.com> wrote:

> Hi All,
>
> I am facing the following issue while exporting a table from Hive to
> Oracle. Importing tables from Oracle to Hive and HDFS works fine. Please
> let me know where I am going wrong. I am pasting my screen output here.
>
> [hadoop@NHCLT-PC44-2 sqoop-oper]$ sqoop export --connect
> jdbc:oracle:thin:@10.99.42.11:1521/clouddb --username HDFSUSER --table
> BTTN_BKP --export-dir /home/hadoop/user/hive/warehouse/bttn -P --verbose
> -m 1 --input-fields-terminated-by '\001'
>
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> 13/03/05 19:20:11 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> Enter password:
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 13/03/05 19:20:16 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:oracle:thin:@10.99.42.11
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Instantiated new connection cache.
> 13/03/05 19:20:16 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/03/05 19:20:16 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.OracleManager@2abe0e27
> 13/03/05 19:20:16 INFO tool.CodeGenTool: Beginning code generation
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM BTTN_BKP t WHERE 1=0
> 13/03/05 19:20:16 DEBUG manager.OracleManager: Creating a new connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb, using username: HDFSUSER
> 13/03/05 19:20:16 DEBUG manager.OracleManager: No connection paramenters specified. Using regular API for making connection.
> 13/03/05 19:20:16 INFO manager.OracleManager: Time zone has been set to GMT
> 13/03/05 19:20:16 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> 13/03/05 19:20:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM BTTN_BKP t WHERE 1=0
> 13/03/05 19:20:16 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: selected columns:
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DATA_INST_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SCR_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_NU
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CAT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   WDTH
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   HGHT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SCAN
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   KEY_SHFT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   LCLZ_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NU
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_ATVT
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ON_CLIK
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   ENBL_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BTTN_ASGN_LVL_NAME
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MKT_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_TS
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   CRTE_USER_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_TS
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   UPDT_USER_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_TS
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DEL_USER_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   DLTD_FL
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   MENU_ITEM_NA
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   PRD_CD
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BLM_SET_NA
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   SOUND_FILE_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   IS_DYNMC_BTTN
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   FRGND_CPTN_COLR_PRSD_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter:   BKGD_CPTN_COLR_PRSD_ID
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Table name: BTTN_BKP
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: Columns: BTTN_ID:2, DATA_INST_ID:2, SCR_ID:2, BTTN_NU:2, CAT:2, WDTH:2, HGHT:2, KEY_SCAN:2, KEY_SHFT:2, FRGND_CPTN_COLR:12, FRGND_CPTN_COLR_PRSD:12, BKGD_CPTN_COLR:12, BKGD_CPTN_COLR_PRSD:12, BLM_FL:2, LCLZ_FL:2, MENU_ITEM_NU:2, BTTN_ASGN_LVL_ID:2, ON_ATVT:2, ON_CLIK:2, ENBL_FL:2, BLM_SET_ID:2, BTTN_ASGN_LVL_NAME:12, MKT_ID:2, CRTE_TS:93, CRTE_USER_ID:12, UPDT_TS:93, UPDT_USER_ID:12, DEL_TS:93, DEL_USER_ID:12, DLTD_FL:2, MENU_ITEM_NA:12, PRD_CD:2, BLM_SET_NA:12, SOUND_FILE_ID:2, IS_DYNMC_BTTN:2, FRGND_CPTN_COLR_ID:2, FRGND_CPTN_COLR_PRSD_ID:2, BKGD_CPTN_COLR_ID:2, BKGD_CPTN_COLR_PRSD_ID:2,
> 13/03/05 19:20:16 DEBUG orm.ClassWriter: sourceFilename is BTTN_BKP.java
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
> 13/03/05 19:20:16 INFO orm.CompilationManager: HADOOP_HOME is /home/hadoop/hadoop-1.0.3/libexec/..
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java
> 13/03/05 19:20:16 DEBUG orm.CompilationManager: Invoking javac with args:
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -sourcepath
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -d
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   -classpath
> 13/03/05 19:20:16 DEBUG orm.CompilationManager:   /home/hadoop/hadoop-1.0.3/libexec/../conf:/usr/java/jdk1.6.0_32/lib/tools.jar:/home/hadoop/hadoop-1.0.3/libexec/..:/home/hadoop/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/home/hadoop/sqoop/conf::/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/hadoop/sqoop/lib/avro-1.5.3.jar:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar:/home/hadoop/sqoop/lib/commons-io-1.4.jar:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar:/home/hadoop/sqoop/lib/ojdbc6.jar:/home/hadoop/sqoop/lib/paranamer-2.3.jar:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar:/home/hadoop/sqoop/sqoop-test-1.4.2.jar::/home/hadoop/hadoop-1.0.3/hadoop-core-1.0.3.jar:/home/hadoop/sqoop/sqoop-1.4.2.jar
> Note: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/03/05 19:20:18 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.class -> BTTN_BKP.class
> 13/03/05 19:20:18 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/8d22103beede09e961b64d0ff8e61e7e/BTTN_BKP.jar
> 13/03/05 19:20:18 INFO mapreduce.ExportJobBase: Beginning export of BTTN_BKP
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:18 INFO manager.OracleManager: Time zone has been set to GMT
> 13/03/05 19:20:18 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@10.99.42.11:1521/clouddb/HDFSUSER
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/sqoop-1.4.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/hsqldb-1.8.0.10.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-ipc-1.5.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jopt-simple-3.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ojdbc6.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/jackson-core-asl-1.7.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-contrib-1.0b3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/snappy-java-1.0.3.2.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/paranamer-2.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-1.5.3.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/commons-io-1.4.jar
> 13/03/05 19:20:18 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/hadoop/sqoop/lib/avro-mapred-1.5.3.jar
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Total input bytes=184266237
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: maxSplitSize=184266237
> 13/03/05 19:20:19 INFO input.FileInputFormat: Total input paths to process : 4
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 13/03/05 19:20:19 DEBUG mapreduce.ExportInputFormat:   Paths:/home/hadoop/user/hive/warehouse/bttn/part-m-00000:0+20908340,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:0+67108864,/home/hadoop/user/hive/warehouse/bttn/part-m-00001:67108864+24822805,/home/hadoop/user/hive/warehouse/bttn/part-m-00002:0+26675150,/home/hadoop/user/hive/warehouse/bttn/part-m-00003:0+44751078 Locations:NHCLT-PC44-2.hclt.corp.hcl.in:;
> 13/03/05 19:20:19 INFO mapred.JobClient: Running job: job_201303051835_0010
> 13/03/05 19:20:20 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:20:36 INFO mapred.JobClient:  map 7% reduce 0%
> 13/03/05 19:20:39 INFO mapred.JobClient:  map 11% reduce 0%
> 13/03/05 19:20:42 INFO mapred.JobClient:  map 16% reduce 0%
> 13/03/05 19:20:45 INFO mapred.JobClient:  map 17% reduce 0%
> 13/03/05 19:20:48 INFO mapred.JobClient:  map 20% reduce 0%
> 13/03/05 19:20:51 INFO mapred.JobClient:  map 27% reduce 0%
> 13/03/05 19:20:54 INFO mapred.JobClient:  map 32% reduce 0%
> 13/03/05 19:20:57 INFO mapred.JobClient:  map 33% reduce 0%
> 13/03/05 19:21:01 INFO mapred.JobClient:  map 38% reduce 0%
> 13/03/05 19:21:04 INFO mapred.JobClient:  map 39% reduce 0%
> 13/03/05 19:21:07 INFO mapred.JobClient:  map 43% reduce 0%
> 13/03/05 19:21:10 INFO mapred.JobClient:  map 44% reduce 0%
> 13/03/05 19:21:13 INFO mapred.JobClient:  map 48% reduce 0%
> 13/03/05 19:21:18 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_0, Status : FAILED
> java.util.NoSuchElementException
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)
>         at BTTN_BKP.__loadFromFields(BTTN_BKP.java:1349)
>         at BTTN_BKP.parse(BTTN_BKP.java:1148)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 13/03/05 19:21:19 INFO mapred.JobClient:  map 0% reduce 0%
> 13/03/05 19:21:27 INFO mapred.JobClient: Task Id : attempt_201303051835_0010_m_000000_1, Status : FAILED
> java.io.IOException: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:639)
>         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (HDFSUSER.BTTN_BKP_PK) violated
>         at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10345)
>         at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)
>         at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:228)
>
> 13/03/05 19:21:48 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 WARN mapred.JobClient: Error reading task outputConnection timed out
> 13/03/05 19:22:09 INFO mapred.JobClient: Job complete: job_201303051835_0010
> 13/03/05 19:22:09 INFO mapred.JobClient: Counters: 8
> 13/03/05 19:22:09 INFO mapred.JobClient:   Job Counters
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=77152
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Rack-local map tasks=3
> 13/03/05 19:22:09 INFO mapred.JobClient:     Launched map tasks=4
> 13/03/05 19:22:09 INFO mapred.JobClient:     Data-local map tasks=1
> 13/03/05 19:22:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 13/03/05 19:22:09 INFO mapred.JobClient:     Failed map tasks=1
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.4837 seconds (0 bytes/sec)
> 13/03/05 19:22:09 INFO mapreduce.ExportJobBase: Exported 0 records.
> 13/03/05 19:22:09 ERROR tool.ExportTool: Error during export: Export job failed!
> [hadoop@NHCLT-PC44-2 sqoop-oper]$
>
> Regards,
> Ajit Kumar Shreevastava