hadoop-hdfs-user mailing list archives

From Girish Lingappa <glinga...@pivotal.io>
Subject Re: How to automate the Sqoop script in Production environment
Date Fri, 24 Oct 2014 15:38:41 GMT
Ravi,
If you are using Oozie in your production environment, one option is to plug your Sqoop job into the Oozie workflow XML using the Oozie Sqoop action.
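A minimal sketch of such an action (the workflow name, the ${jobTracker}/${nameNode} properties, and the connection details in the command are placeholders to adapt):

    <workflow-app name="sqoop-import-wf" xmlns="uri:oozie:workflow:0.4">
        <start to="sqoop-import"/>
        <action name="sqoop-import">
            <sqoop xmlns="uri:oozie:sqoop-action:0.3">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <command>import --connect jdbc:oracle:thin:@host:1521/DB --username username --password-file /user/hdfs/.oracle.password --table TABLENAME --hive-import</command>
            </sqoop>
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>Sqoop import failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
        </kill>
        <end name="end"/>
    </workflow-app>

You can then run the workflow on a fixed schedule with an Oozie coordinator.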

Thanks
Girish

Sent from my iPhone

> On Oct 24, 2014, at 4:17 AM, Dhandapani, Karthik <Karthik.Dhandapani@CVSCaremark.com> wrote:
> 
> Hi,
> 
> There is an option for that.
> 
> Use --password-file <path>, which sets the path to a file containing the authentication password.
> 
> http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
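> 
> For example (a sketch; the HDFS path and the password are placeholders, and the file should be readable only by the job user):
> 
> # Create the password file with no trailing newline (hence echo -n),
> # upload it to HDFS, and lock down its permissions
> echo -n "MyDbPassword" > .oracle.password
> hdfs dfs -put .oracle.password /user/hdfs/.oracle.password
> hdfs dfs -chmod 400 /user/hdfs/.oracle.password
> rm .oracle.password
> 
> # Reference the file instead of putting --password on the command line
> sqoop import \
>   --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> \
>   --username username \
>   --password-file /user/hdfs/.oracle.password \
>   --table <tablename>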
> 
> All the dynamic parameter values can be passed in as Unix variables to automate the Sqoop script for different tables. Copy the script below into a .sh file and run it from any scheduler.
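> 
> For instance, a parameterized wrapper might look like this (a sketch; the connection details and paths are placeholders to adapt):
> 
> #!/bin/bash
> # sqoop_import.sh -- import one Oracle table into Hive;
> # the table name is passed as the first argument
> TABLE=$1
> DB_HOST=<ipaddress>
> DB_NAME=<DB>
> DB_USER=username
> PASS_FILE=/user/hdfs/.oracle.password
> 
> sqoop import \
>   --connect jdbc:oracle:thin:@${DB_HOST}:1521/${DB_NAME} \
>   --username ${DB_USER} \
>   --password-file ${PASS_FILE} \
>   --table ${TABLE} \
>   --hive-import --hive-overwrite \
>   --hive-table default.${TABLE} \
>   --lines-terminated-by '\n' \
>   --fields-terminated-by ',' \
>   --target-dir /user/hdfs/${TABLE}
> 
> Then something like ./sqoop_import.sh ORACREPORT can be run from cron or any other scheduler.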
> 
> Thanks,
> Karthik
> ________________________________________
> From: Ravi Prasad [raviprasad29@gmail.com]
> Sent: Friday, October 24, 2014 7:05 AM
> To: user@hadoop.apache.org
> Subject: How to automate the Sqoop script in Production environment
> 
> Hi all,
> 
> 1) Can anyone please suggest how to automate Sqoop scripts in a production environment?
> 
> I need to import data from Oracle tables into Hadoop Hive tables using the script below.
> 
> sqoop import \
>   --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> \
>   --username username --password password \
>   --table <tablename> \
>   --columns column1,column2,column3 \
>   --hive-import --hive-overwrite \
>   --hive-table default.oracreport \
>   --lines-terminated-by '\n' \
>   --fields-terminated-by ',' \
>   --target-dir /user/hdfs/
> 
> 
> 2) Is there any way to hide the password?
> 
> ----------------------------------------------
> Regards,
> RAVI PRASAD. T
