Date: Fri, 27 Jan 2017 15:41:24 +0000 (UTC)
From: "Dmitry Zagorulkin (JIRA)"
To: dev@sqoop.apache.org
Subject: [jira] [Updated] (SQOOP-3123) Import from oracle using oraoop with map-column-java to avro fails if special characters encounter in table name or column name

    [ https://issues.apache.org/jira/browse/SQOOP-3123?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dmitry Zagorulkin updated SQOOP-3123:
-------------------------------------
    Flags: Patch

> Import from oracle using oraoop with map-column-java to avro fails if special characters encounter in table name or column name
> --------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-3123
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3123
>             Project: Sqoop
>          Issue Type: Bug
>          Components: codegen
>    Affects Versions: 1.4.6, 1.4.7
>            Reporter: Dmitry Zagorulkin
>             Fix For: 1.4.7
>
>         Attachments: SQOOP_3123.patch
>
>
> I'm trying to import data from Oracle to Avro using OraOop.
> My table:
> {code}
> CREATE TABLE "IBS"."BRITISH#CATS"
>    ( "ID" NUMBER,
>      "C_CODE" VARCHAR2(10),
>      "C_USE_START#DATE" DATE,
>      "C_USE_USE#NEXT_DAY" VARCHAR2(1),
>      "C_LIM_MIN#DAT" DATE,
>      "C_LIM_MIN#TIME" TIMESTAMP,
>      "C_LIM_MIN#SUM" NUMBER,
>      "C_OWNCODE" VARCHAR2(1),
>      "C_LIMIT#SUM_LIMIT" NUMBER(17,2),
>      "C_L@M" NUMBER(17,2),
>      "C_1_THROW" NUMBER NOT NULL ENABLE,
>      "C_#_LIMITS" NUMBER NOT NULL ENABLE
>    ) SEGMENT CREATION IMMEDIATE
>   PCTFREE 70 PCTUSED 40 INITRANS 2 MAXTRANS 255
>   NOCOMPRESS LOGGING
>   STORAGE(INITIAL 2097152 NEXT 524288 MINEXTENTS 1 MAXEXTENTS 2147483645
>   PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
>   BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
>   TABLESPACE "WORK" ;
> {code}
> My first script is:
> {code}
> ./sqoop import \
>   -Doraoop.timestamp.string=false \
>   --direct \
>   --connect jdbc:oracle:thin:@localhost:49161:XE \
>   --username system \
>   --password oracle \
>   --table IBS.BRITISH#CATS \
>   --target-dir /Users/Dmitry/Developer/Java/sqoop/bin/imported \
>   --as-avrodatafile \
>   --map-column-java ID=String,C_CODE=String,C_USE_START#DATE=String,C_USE_USE#NEXT_DAY=String,C_LIM_MIN#DAT=String,C_LIM_MIN#TIME=String,C_LIM_MIN#SUM=String,C_OWNCODE=String,C_LIMIT#SUM_LIMIT=String,C_L_M=String,C_1_THROW=String,C_#_LIMITS=String
> {code}
> It fails with:
> {code}
> 2017-01-13 16:11:21,348 ERROR [main] tool.ImportTool (ImportTool.java:run(625)) - Import failed: No column by the name C_LIMIT#SUM_LIMITfound while importing data; expecting one of [C_LIMIT_SUM_LIMIT, C_OWNCODE, C_L_M, C___LIMITS, C_LIM_MIN_DAT, C_1_THROW, C_CODE, C_USE_START_DATE, C_LIM_MIN_SUM, ID, C_LIM_MIN_TIME, C_USE_USE_NEXT_DAY]
> {code}
> After that I found that Sqoop had replaced all special characters with underscores.
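The underscore substitution described above can be illustrated with a simplified sketch. This is not Sqoop's actual source, just a hypothetical stand-in for how generated Java/Avro identifiers are derived from column names: any character that is not a letter, digit, or underscore is replaced with `_`, which is why `C_LIMIT#SUM_LIMIT` appears in the error list as `C_LIMIT_SUM_LIMIT` and `C_#_LIMITS` as `C___LIMITS`.

```java
// Hypothetical sketch (not Sqoop's actual code) of how column names are
// normalized into valid Java/Avro identifiers during codegen.
public class IdentifierSanitizer {
    // Replace every character that is not a letter, digit, or '_' with '_'.
    public static String toIdentifier(String columnName) {
        StringBuilder sb = new StringBuilder(columnName.length());
        for (char c : columnName.toCharArray()) {
            sb.append(Character.isLetterOrDigit(c) || c == '_' ? c : '_');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toIdentifier("C_LIMIT#SUM_LIMIT")); // C_LIMIT_SUM_LIMIT
        System.out.println(toIdentifier("C_L@M"));             // C_L_M
        System.out.println(toIdentifier("C_#_LIMITS"));        // C___LIMITS
    }
}
```

This matches the sanitized names the first error message expects, which is why `--map-column-java` entries must use the sanitized form rather than the original Oracle column names.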
> My second script is:
> {code}
> ./sqoop import \
>   -D oraoop.timestamp.string=false \
>   --direct \
>   --connect jdbc:oracle:thin:@localhost:49161:XE \
>   --username system \
>   --password oracle \
>   --table IBS.BRITISH#CATS \
>   --target-dir /Users/Dmitry/Developer/Java/sqoop/bin/imported \
>   --as-avrodatafile \
>   --map-column-java ID=String,C_CODE=String,C_USE_START_DATE=String,C_USE_USE_NEXT_DAY=String,C_LIM_MIN_DAT=String,C_LIM_MIN_TIME=String,C_LIM_MIN_SUM=String,C_OWNCODE=String,C_LIMIT_SUM_LIMIT=String,C_L_M=String,C_1_THROW=String,C___LIMITS=String \
>   --verbose
> {code}
> It fails with "Caused by: org.apache.avro.UnresolvedUnionException: Not in union ["null","long"]: 2017-01-13 11:22:53.0":
> {code}
> 2017-01-13 16:14:54,687 WARN [Thread-26] mapred.LocalJobRunner (LocalJobRunner.java:run(560)) - job_local1372531461_0001
> java.lang.Exception: org.apache.avro.file.DataFileWriter$AppendWriteException: org.apache.avro.UnresolvedUnionException: Not in union ["null","long"]: 2017-01-13 11:22:53.0
> 	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
> 	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
> Caused by: org.apache.avro.file.DataFileWriter$AppendWriteException: org.apache.avro.UnresolvedUnionException: Not in union ["null","long"]: 2017-01-13 11:22:53.0
> 	at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:308)
> 	at org.apache.sqoop.mapreduce.AvroOutputFormat$1.write(AvroOutputFormat.java:112)
> 	at org.apache.sqoop.mapreduce.AvroOutputFormat$1.write(AvroOutputFormat.java:108)
> 	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:655)
> 	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
> 	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
> 	at org.apache.sqoop.mapreduce.AvroImportMapper.map(AvroImportMapper.java:73)
> 	at org.apache.sqoop.mapreduce.AvroImportMapper.map(AvroImportMapper.java:39)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> 	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> 	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.avro.UnresolvedUnionException: Not in union ["null","long"]: 2017-01-13 11:22:53.0
> 	at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:709)
> 	at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:192)
> 	at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:110)
> 	at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
> 	at org.apache.avro.reflect.ReflectDatumWriter.write(ReflectDatumWriter.java:150)
> 	at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:153)
> 	at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:90)
> 	at org.apache.avro.reflect.ReflectDatumWriter.writeField(ReflectDatumWriter.java:182)
> 	at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:143)
> 	at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:105)
> 	at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:73)
> 	at org.apache.avro.reflect.ReflectDatumWriter.write(ReflectDatumWriter.java:150)
> 	at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:60)
> 	at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:302)
> 	... 17 more
> {code}
> I've found an older report of this problem which suggests that *oraoop.timestamp.string=false* should solve it, but it does not.
> What do you think?
> Also, please assign this issue to me.

-- 
This message was sent by Atlassian JIRA
(v6.3.4#6332)
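The UnresolvedUnionException in the second run can be illustrated with a dependency-free sketch. This is a hypothetical stand-in for Avro's union branch resolution, not the real `GenericData.resolveUnion`: the generated schema declares the timestamp column as `["null","long"]`, so when Sqoop hands the writer the string "2017-01-13 11:22:53.0" instead of a long epoch value, no union branch matches and the write fails.

```java
// Hypothetical, simplified stand-in for Avro's ["null","long"] union
// resolution: a value must match exactly one declared branch.
public class UnionCheck {
    public static String resolveNullLongUnion(Object value) {
        if (value == null) return "null";
        if (value instanceof Long) return "long";
        // Mirrors the message Avro produces for an unmatched value.
        throw new RuntimeException("Not in union [\"null\",\"long\"]: " + value);
    }

    public static void main(String[] args) {
        System.out.println(resolveNullLongUnion(1484306573000L)); // matches "long"
        try {
            // A timestamp rendered as a String matches neither branch.
            resolveNullLongUnion("2017-01-13 11:22:53.0");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In other words, mapping the column to `String` with `--map-column-java` only changes the generated record class; unless the Avro schema branch is also changed to `"string"`, the writer still rejects the value.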