hadoop-common-user mailing list archives

From "Dhandapani, Karthik" <Karthik.Dhandap...@CVSCaremark.com>
Subject RE: How to automate the Sqoop script in Production environment
Date Fri, 24 Oct 2014 11:17:31 GMT
Hi,

There is an option for this.

Use  --password-file <file>     Sets the path to a file containing the authentication password

http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
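As a sketch of how such a password file might be prepared (the paths and the password value below are placeholders, not taken from this thread), the file can be written locally without a trailing newline, locked down to owner-read-only, and then copied to HDFS where the Sqoop tasks can reach it:

```shell
#!/bin/sh
# Sketch: prepare a password file for Sqoop's --password-file option.
# "mysecret" and the paths are placeholders for illustration only.

PW_LOCAL="$HOME/.oracle.password"

# Use printf rather than echo so no trailing newline ends up
# inside the stored credential.
printf '%s' 'mysecret' > "$PW_LOCAL"

# Restrict the file so only the owner can read it.
chmod 400 "$PW_LOCAL"

# Copy it to HDFS with the same restriction (requires a Hadoop client,
# so it is left commented out here):
# hdfs dfs -put "$PW_LOCAL" /user/hdfs/.oracle.password
# hdfs dfs -chmod 400 /user/hdfs/.oracle.password
```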

All the dynamic parameter values can be passed in as Unix variables to automate the Sqoop
script for different tables. Copy the script below into a .sh file and run it from any
scheduler.
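A minimal sketch of such a wrapper is below. Every concrete value (host, SID, username, table name, paths) is a placeholder assumed for illustration, not taken from this thread; the script only echoes the assembled command so it can be inspected without a Hadoop cluster:

```shell
#!/bin/sh
# Sketch of a parameterized Sqoop import wrapper; all values are placeholders.
ORA_HOST="dbhost.example.com"
ORA_PORT="1521"
ORA_SID="ORCL"
ORA_USER="scott"
PASSWORD_FILE="/user/hdfs/.oracle.password"   # assumed HDFS path

# Table name comes in from the scheduler as the first argument.
TABLE="${1:-EMPLOYEES}"
HIVE_TABLE="default.$(echo "$TABLE" | tr 'A-Z' 'a-z')"

SQOOP_CMD="sqoop import \
 --connect jdbc:oracle:thin:@${ORA_HOST}:${ORA_PORT}/${ORA_SID} \
 --username ${ORA_USER} \
 --password-file ${PASSWORD_FILE} \
 --table ${TABLE} \
 --hive-import --hive-overwrite --hive-table ${HIVE_TABLE} \
 --lines-terminated-by '\n' --fields-terminated-by ',' \
 --target-dir /user/hdfs/${TABLE}"

# Echo instead of executing so the sketch runs anywhere;
# in production, replace the echo with:  eval "$SQOOP_CMD"
echo "$SQOOP_CMD"
```

Scheduling it per table then reduces to something like `./sqoop_import.sh EMPLOYEES` from cron or any scheduler, with the password kept out of the command line entirely.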

Thanks,
Karthik
________________________________________
From: Ravi Prasad [raviprasad29@gmail.com]
Sent: Friday, October 24, 2014 7:05 AM
To: user@hadoop.apache.org
Subject: How to automate the Sqoop script in Production environment

Hi all,

1) Can anyone please suggest how to automate Sqoop scripts in a production environment?

I need to import data from Oracle tables into Hadoop Hive tables using the script below.

sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> --username username \
 --password password --table <tablename> --columns column1,column2,column3 \
 --hive-import --hive-overwrite --hive-table default.oracreport \
 --lines-terminated-by '\n' --fields-terminated-by ',' --target-dir /user/hdfs/


2) Is there any way to hide the password?

----------------------------------------------
Regards,
RAVI PRASAD. T
