hive-user mailing list archives

From Sunita Arvind <sunitarv...@gmail.com>
Subject Seeking Help configuring log4j for sqoop import into hive
Date Mon, 11 Nov 2013 21:48:31 GMT
Hello,

I am using sqoop to import data from oracle into hive. Below is my SQL:

nohup sqoop import \
  --connect "jdbc:oracle:thin:@(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = xxxxxxx)(PORT = xxxx)) (CONNECT_DATA = (SERVER = DEDICATED) (SERVICE_NAME = CDWQ.tms.toyota.com) (FAILOVER_MODE = (TYPE = select) (METHOD = basic))))" \
  --username "xxxx" --password "xxxx" \
  --split-by employeeid \
  --query "SELECT e.employeeid, p.salary FROM employee e, payroll p WHERE e.employeeid = p.employeeid AND $CONDITIONS" \
  --create-hive-table --hive-table "EMPLOYEE" --hive-import \
  --target-dir "/user/hive/warehouse/employee" \
  --direct --verbose
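
One thing I am considering is raising the map-task log level through Hadoop's generic -D options (which must come before the tool-specific arguments), so that the swallowed root cause lands in the task logs. This is only a sketch with the connection details redacted; mapred.map.child.log.level is the MR1 property name, which I am assuming applies on this CDH version:

```shell
# Sketch only: raise the map-task log level so the full "Caused by:"
# appears in the attempt logs. mapred.map.child.log.level is the MR1
# property name (an assumption for this Hadoop version); connection
# details are redacted as in the command above.
sqoop import \
  -D mapred.map.child.log.level=DEBUG \
  --connect "jdbc:oracle:thin:@(...)" \
  --username "xxxx" --password "xxxx" \
  --split-by employeeid \
  --query "SELECT ... AND \$CONDITIONS" \
  --create-hive-table --hive-table "EMPLOYEE" --hive-import \
  --target-dir "/user/hive/warehouse/employee" --verbose
```

(The \$ escape keeps the shell from expanding $CONDITIONS before sqoop sees it, per the Sqoop user guide's note on double-quoted free-form queries.)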


Note: this is production data, hence I cannot share the log file or the actual query. Sorry about that.

A similar query works for some tables, but for this particular table I get the exception below:

java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:266)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.sql
attempt_201311071517_0011_m_000003_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201311071517_0011_m_000003_0: log4j:WARN Please initialize the log4j system properly.
attempt_201311071517_0011_m_000003_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/11/11 11:35:20 INFO mapred.JobClient: Task Id : attempt_201311071517_0011_m_000000_0, Status : FAILED


I eyeballed the data for date-format issues, which the forums cite as the typical root cause of such errors, but that does not seem to be the case here (I could be wrong). I also added the --direct option as suggested in some posts, and that did not help either.

The actual exception after the "Caused by:" is missing, which makes me believe that sqoop is trying to redirect the output to some log file, cannot find the necessary log4j configuration, and hence is not dumping the full stacktrace.

*Seeking help from the community: how do I configure sqoop so that the complete stacktrace is displayed?*
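
For reference, a minimal log4j.properties that would satisfy the "No appenders could be found" warning might look like the sketch below. The appender and layout classes are standard log4j 1.2; which file the task JVM actually picks up on its classpath is the part I am unsure about:

```properties
# Minimal log4j 1.2 sketch: send everything to the console (stderr),
# which the task runner captures into the attempt's stdout/stderr logs.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Bump sqoop's own classes to DEBUG to surface the swallowed SQLException
log4j.logger.org.apache.sqoop=DEBUG
```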

I looked at the log4j.properties files in the environment but did not find anything specific to sqoop:
./etc/cloudera-scm-server/log4j.properties
./etc/hadoop/conf.cloudera.mapreduce1/log4j.properties
./etc/hadoop/conf.cloudera.hdfs1/log4j.properties
./etc/hadoop/conf.empty/log4j.properties
./etc/hadoop-0.20/conf.cloudera.mapreduce1/log4j.properties
./etc/hadoop-0.20/conf.cloudera.hdfs1/log4j.properties
./etc/hue/log4j.properties
./etc/hbase/conf.dist/log4j.properties
./etc/zookeeper/conf.dist/log4j.properties
./etc/pig/conf.dist/log4j.properties
./var/run/cloudera-scm-agent/process/303-mapreduce-TASKTRACKER/log4j.properties
./var/run/cloudera-scm-agent/process/321-hdfs-SECONDARYNAMENODE/log4j.properties
./var/run/cloudera-scm-agent/process/311-hue-BEESWAX_SERVER/hadoop-conf/log4j.properties
./var/run/cloudera-scm-agent/process/307-oozie-OOZIE_SERVER/hadoop-conf/log4j.properties
./var/run/cloudera-scm-agent/process/307-oozie-OOZIE_SERVER/log4j.properties
./var/run/cloudera-scm-agent/process/315-impala-IMPALAD/impala-conf/log4j.properties
./var/run/cloudera-scm-agent/process/308-hive-HIVEMETASTORE/hadoop-conf/log4j.properties

regards
Sunita
