hadoop-mapreduce-user mailing list archives

From Kumar Jayapal <kjayapa...@gmail.com>
Subject Re: sqoop export query error out..
Date Thu, 16 Jul 2015 18:55:36 GMT
Hi James,

I tried with uppercase and that worked; now I am getting a different error, and
I don't see the .metadata dir in that path.

15/07/16 18:49:12 ERROR sqoop.Sqoop: Got exception running Sqoop: org.kitesdk.data.DatasetNotFoundException: Descriptor location does not exist: hdfs://name/user/hive/warehouse/testing_bi.db/testing_dimension/.metadata
org.kitesdk.data.DatasetNotFoundException: Descriptor location does not exist: hdfs://name/user/hive/warehouse/testing_bi.db/testing_dimension/.metadata
        at org.kitesdk.data.spi.filesystem.FileSystemMetadataProvider.checkExists(FileSystemMetadataProvider.java:527)
        at org.kitesdk.data.spi.filesystem.FileSystemMetadataProvider.find(FileSystemMetadataProvider.java:570)
        at org.kitesdk.data.spi.filesystem.FileSystemMetadataProvider.load(FileSystemMetadataProvider.java:112)
        at org.kitesdk.data.spi.filesystem.FileSystemDatasetRepository.load(FileSystemDatasetRepository.java:228)
        at org.kitesdk.data.Datasets.load(Datasets.java:69)
        at org.kitesdk.data.Datasets.load(Datasets.java:113)
        at org.kitesdk.data.mapreduce.DatasetKeyInputFormat.load(DatasetKeyInputFormat.java:274)
        at org.kitesdk.data.mapreduce.DatasetKeyInputFormat.setConf(DatasetKeyInputFormat.java:214)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:586)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:606)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:490)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
        at org.apache.sqoop.mapreduce.ExportJobBase.doSubmitJob(ExportJobBase.java:302)
        at org.apache.sqoop.mapreduce.ExportJobBase.runJob(ExportJobBase.java:279)
        at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:420)
        at org.apache.sqoop.manager.OracleManager.exportTable(OracleManager.java:455)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
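
The .metadata directory is the Kite dataset descriptor Sqoop is looking for; listing the parent directory shows whether it exists (the nameservice "name" and the dataset path below are copied from the stack trace):

  hdfs dfs -ls /user/hive/warehouse/testing_bi.db/testing_dimension

If the table's data files are Parquet but the directory was written by Hive or Impala rather than Kite, there is no .metadata descriptor and Sqoop's Kite-based export path fails exactly like this. One workaround that is often suggested for Parquet-backed Hive tables is to export through HCatalog instead of pointing --export-dir at the raw files. A sketch only, reusing the connect string from the earlier command and assuming HCatalog is configured on the cluster and the Hive table really is testing_bi.testing_dimension:

  sqoop export --connect "jdbc:oracle:thin:@lorsasa.ss.com:1521/pocd01us" \
    --username "TESTUSER" -P \
    --table "TEST_DIMENSION" \
    --hcatalog-database "testing_bi" \
    --hcatalog-table "testing_dimension"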

Thanks
Jay

On Wed, Jul 15, 2015 at 9:12 PM, James Bond <bond.bhai@gmail.com> wrote:

> This clearly says it's an Oracle error. Try checking the following -
> 1. Whether the user "TESTUSER" has read/write privileges.
> 2. Whether the table TEST_DIMENSION is in the same schema as TESTUSER; if not,
> try prefixing the table with the schema name - <schema/owner>.TEST_DIMENSION
> (see the example after this list).
> 3. Make sure you are connecting to the right server and service name.
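>
> For example, the schema-qualified form of point 2 would look like this (a sketch
> only; SCHEMA_OWNER is a placeholder for the table's actual owner, and the connect
> string and export dir are copied from your command below):
>
>   sqoop export --connect "jdbc:oracle:thin:@lorsasa.ss.com:1521/pocd01us" \
>     --username "TESTUSER" -P \
>     --table "SCHEMA_OWNER.TEST_DIMENSION" \
>     --export-dir "/user/hive/warehouse/test.db/test_dimension/"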
>
> Thanks,
> Ashwin
>
>
> On Thu, Jul 16, 2015 at 5:05 AM, Kumar Jayapal <kjayapal17@gmail.com>
> wrote:
>
>> Hi,
>>
>> I can see the table in my database, but when I execute an export command I get
>> "table or view does not exist". When I run list-tables, I can see the table.
>>
>> Can someone tell me what is wrong? I have DBA privileges.
>>
>> sqoop export --connect "jdbc:oracle:thin:@lorsasa.ss.com:1521/pocd01us"
>>  --username " TESTUSER" --P --table "TEST_DIMENSION" --export-dir
>> "/user/hive/warehouse/test.db/test_dimension/"
>> Warning: /opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
>> Please set $ACCUMULO_HOME to the root of your Accumulo installation.
>> 15/07/15 23:20:03 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
>> Enter password:
>> 15/07/15 23:20:20 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
>> 15/07/15 23:20:20 INFO manager.SqlManager: Using default fetchSize of 1000
>> 15/07/15 23:20:20 INFO tool.CodeGenTool: Beginning code generation
>> 15/07/15 23:20:21 INFO manager.OracleManager: Time zone has been set to GMT
>> 15/07/15 23:20:21 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM TEST_DIMENSION t WHERE 1=0
>> 15/07/15 23:20:21 ERROR manager.SqlManager: Error executing statement: java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
>>
>> java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
>>
>>         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)
>>         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:396)
>>         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:951)
>>         at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:513)
>>         at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:227)
>>         at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:531)
>>         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:208)
>>         at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:886)
>>         at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1175)
>>         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1296)
>>         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3613)
>>         at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3657)
>>         at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1495)
>>         at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:750)
>>         at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:759)
>>         at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:269)
>>         at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
>>         at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
>>         at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
>>         at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
>>         at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
>>         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
>>         at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
>>         at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
>>         at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
>>         at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
>> 15/07/15 23:20:21 ERROR tool.ExportTool: Encountered IOException running
>> export job: java.io.IOException: No columns to generate for ClassWriter
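>>
>> One way to see exactly which owner and table name Oracle has registered is a
>> quick catalog check (a sketch using sqoop eval; ALL_TABLES stores unquoted
>> identifiers in uppercase, so a case or schema mismatch shows up here):
>>
>>   sqoop eval --connect "jdbc:oracle:thin:@lorsasa.ss.com:1521/pocd01us" \
>>     --username "TESTUSER" -P \
>>     --query "SELECT owner, table_name FROM all_tables WHERE upper(table_name) = 'TEST_DIMENSION'"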
>>
>> Thanks
>> Jay
>>
>
>
