sqoop-dev mailing list archives

From "Andrey Dmitriev (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-1329) JDBC connection to Oracle timeout after data import but before hive metadata import
Date Thu, 29 May 2014 23:55:03 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-1329?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14013115#comment-14013115 ]

Andrey Dmitriev commented on SQOOP-1329:
----------------------------------------

Hi Gwen,

Yes, it has been tested. This may not be the best solution, but it works, it solves the
problem, and it has minimal impact on the existing code.
I set the connection to null to make sure that a new connection will be established at the
next step.
This is my first patch, so please let me know if I'm doing something wrong.
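To illustrate the idea behind the patch, here is a minimal sketch (not Sqoop's actual code; the class and method names below are hypothetical simplifications of the cached-connection handling in `org.apache.sqoop.manager.OracleManager`): after the long-running MapReduce import, the cached JDBC connection is discarded by setting it to null, so the next `getConnection()` call lazily opens a fresh session instead of reusing one that Oracle may have killed for exceeding its idle time.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for a connection-caching manager. A real manager
// would hold a java.sql.Connection; here a tiny Conn class with an id
// counter makes the discard-then-reconnect behavior observable.
public class ConnectionCache {
    static final AtomicInteger opened = new AtomicInteger();

    static class Conn {
        final int id = opened.incrementAndGet(); // each "connection" gets a new id
    }

    private Conn cached;

    Conn getConnection() {
        if (cached == null) {
            cached = new Conn(); // establish a new connection lazily
        }
        return cached;
    }

    void discardConnection() {
        cached = null; // the patch's idea: force a reconnect on next use
    }

    public static void main(String[] args) {
        ConnectionCache mgr = new ConnectionCache();
        Conn before = mgr.getConnection();
        // ... long MapReduce import runs here; Oracle kills the idle session ...
        mgr.discardConnection();
        Conn after = mgr.getConnection();
        System.out.println(before.id != after.id); // a fresh connection was opened
    }
}
```

The appeal of this approach is that it avoids any timing or validity logic: the connection is unconditionally thrown away at the one point where it is known to have been idle for the entire import.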

Thank you,
Andrey



> JDBC connection to Oracle timeout after data import but before hive metadata import
> -----------------------------------------------------------------------------------
>
>                 Key: SQOOP-1329
>                 URL: https://issues.apache.org/jira/browse/SQOOP-1329
>             Project: Sqoop
>          Issue Type: Bug
>          Components: connectors/oracle
>    Affects Versions: 1.4.4
>         Environment: Red Hat Enterprise Linux Server release 6.5
>            Reporter: Andrey Dmitriev
>            Priority: Critical
>              Labels: oracle
>             Fix For: 1.4.4
>
>         Attachments: SQOOP-1329.patch
>
>
> When I'm importing a table from Oracle that takes more than an hour to extract, I get the following error message at the stage where Sqoop tries to load the data from the temporary HDFS location into Hive:
> {quote}
> 14/05/27 13:05:51 INFO mapreduce.ImportJobBase: Transferred 47.2606 GB in 6,389.4644 seconds (6.7206 MB/sec)
> 14/05/27 13:05:51 INFO mapreduce.ImportJobBase: Retrieved 98235461 records.
> 14/05/27 13:05:51 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@566d0085
> 14/05/27 13:05:51 DEBUG hive.HiveImport: Hive.inputTable: WAREHOUSE.MY_BIG_TABLE
> 14/05/27 13:05:51 DEBUG hive.HiveImport: Hive.outputTable: WAREHOUSE.MY_BIG_TABLE
> 14/05/27 13:05:51 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM WAREHOUSE.MY_BIG_TABLE t WHERE 1=0
> 14/05/27 13:05:51 DEBUG manager.SqlManager: Execute getColumnTypesRawQuery : SELECT t.* FROM WAREHOUSE.MY_BIG_TABLE t WHERE 1=0
> 14/05/27 13:05:51 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: ORA-02396: exceeded maximum idle time, please connect again
> java.sql.SQLException: ORA-02396: exceeded maximum idle time, please connect again
> 	at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)
> 	at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:389)
> {quote}
> With small tables (extracts under one hour) everything works fine.
> I'm using Sqoop v1.4.4:
> {quote}
> 14/05/27 13:49:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.0
> Sqoop 1.4.4-cdh5.0.0
> git commit id 8e266e052e423af592871e2dfe09d54c03f6a0e8
> {quote}
> This problem looks exactly like the one described in SQOOP-934.
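An alternative mitigation for this class of failure, sketched here as an assumption and not as anything the attached patch does, would be to validate the cached connection with `java.sql.Connection.isValid(int)` (standard since JDBC 4) before issuing the column-types query, and reconnect only if Oracle has already killed the idle session. The stub below uses a dynamic proxy so the pattern can run without a real database; `freshIfStale` and `Supplier` are hypothetical names.

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;

// Sketch of a validate-before-reuse guard for a cached JDBC connection.
public class ValidGuard {
    interface Supplier { Connection get(); }

    // Return the cached connection if it is still alive, otherwise reconnect.
    static Connection freshIfStale(Connection cached, Supplier reconnect) throws Exception {
        if (cached == null || !cached.isValid(5)) { // 5-second validation timeout
            return reconnect.get();
        }
        return cached;
    }

    // Stub Connection via dynamic proxy: only isValid() is answered.
    static Connection stub(final boolean valid) {
        return (Connection) Proxy.newProxyInstance(
            Connection.class.getClassLoader(),
            new Class<?>[] { Connection.class },
            (proxy, method, args) ->
                method.getName().equals("isValid") ? valid : null);
    }

    public static void main(String[] args) throws Exception {
        Connection stale = stub(false); // simulates an ORA-02396-killed session
        Connection fresh = stub(true);
        Connection used = freshIfStale(stale, () -> fresh);
        System.out.println(used == fresh); // stale connection was replaced
    }
}
```

Compared with unconditionally discarding the connection, this keeps a still-valid session alive, at the cost of one extra round trip for the validity check.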



--
This message was sent by Atlassian JIRA
(v6.2#6252)
