sqoop-dev mailing list archives

From "Rafael Pecin Ferreira (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SQOOP-2842) Sqoop import job fails on large table
Date Tue, 10 Jan 2017 13:45:58 GMT

    [ https://issues.apache.org/jira/browse/SQOOP-2842?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15815001#comment-15815001 ]

Rafael Pecin Ferreira commented on SQOOP-2842:
----------------------------------------------

I'm having the same issue with the CDH Sqoop bundle.

> Sqoop import job fails on large table
> -------------------------------------
>
>                 Key: SQOOP-2842
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2842
>             Project: Sqoop
>          Issue Type: Bug
>          Components: sqoop2-hdfs-connector, sqoop2-jdbc-connector, sqoop2-shell
>    Affects Versions: 1.99.6
>         Environment: OS X Yosemite 10.10.5, Sqoop 1.99.6, Hadoop 2.7.2 (Homebrew installation), Oracle 11
>            Reporter: Brian Vanover
>              Labels: easyfix, newbie
>             Fix For: 1.99.6
>
>   Original Estimate: 3h
>  Remaining Estimate: 3h
>
> I am prototyping the migration of a large record set generated by a computationally expensive custom query. This query takes approximately 1-2 hours to return a result set in SQL Developer.
> I am attempting to pass this query to a simple Sqoop job with JDBC and HDFS links.
> I have encountered the following errors in my logs:
> 2016-02-12 10:15:50,690 ERROR mr.SqoopOutputFormatLoadExecutor [org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread.run(SqoopOutputFormatLoadExecutor.java:257)] Error while loading data out of MR job.
> org.apache.sqoop.common.SqoopException: GENERIC_HDFS_CONNECTOR_0005:Error occurs during loader run
>     at org.apache.sqoop.connector.hdfs.HdfsLoader.load(HdfsLoader.java:110)
>     at org.apache.sqoop.connector.hdfs.HdfsLoader.load(HdfsLoader.java:41)
>     at org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread.run(SqoopOutputFormatLoadExecutor.java:250)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /user/username/schema/recordset/.72dee005-3f75-4a95-bbc0-30c6b565d193/f5aeeecc-097e-49ab-99cc-b5032ae18a84.txt (inode 16415): File does not exist. [Lease. Holder: DFSClient_NONMAPREDUCE_-1820866823_31, pendingcreates: 1]
> When I check the resulting .txt files in HDFS, they are empty.
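For context on the error quoted above: a LeaseExpiredException of the form "No lease on ... File does not exist" is generally raised by the NameNode when a client keeps writing to a file whose path has since been deleted or recreated by another client, for example a retried or speculative task attempt cleaning up the same output directory while the original HdfsLoader is still blocked waiting on the slow JDBC extract. Below is a minimal, self-contained Java sketch of that failure pattern, not the actual Sqoop code path; the NameNode URI and target path are hypothetical and chosen only for illustration.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LeaseExpiredRepro {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode URI and target path, for illustration only.
        URI hdfs = URI.create("hdfs://localhost:8020");
        Path target = new Path("/tmp/lease-demo/part-0.txt");

        // Client A opens the file and keeps it open, the way a loader holds
        // its output file open while waiting for rows from a slow extract.
        FileSystem clientA = FileSystem.newInstance(hdfs, conf);
        FSDataOutputStream out = clientA.create(target, true);
        out.writeBytes("first rows\n");
        out.hflush();

        // A second client deletes the same path, standing in for a retried or
        // speculative attempt that cleans up or recreates the output file.
        FileSystem clientB = FileSystem.newInstance(hdfs, conf);
        clientB.delete(target, false);

        // Client A's next NameNode interaction (close, or allocating a new
        // block) now fails with
        // LeaseExpiredException: No lease on ... File does not exist.
        out.writeBytes("more rows\n");
        out.close();
    }
}

Run against a test cluster, the final close() ends in the same LeaseExpiredException as in the quoted log, which would also be consistent with the empty or truncated .txt files mentioned above.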



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
