hadoop-mapreduce-user mailing list archives

From Ravi Prasad <raviprasa...@gmail.com>
Subject Re: How to automate the Sqoop script in Production environment
Date Sat, 25 Oct 2014 04:14:53 GMT
Thanks a lot Karthik, Girish and Laurent

On Fri, Oct 24, 2014 at 9:30 PM, Laurent H <laurent.hatier@gmail.com> wrote:

> That's right, it's better to use the Oozie scheduler for your production
> environment! (You can easily check job status & logs there.) Check the link
> below: http://oozie.apache.org/docs/4.0.0/DG_SqoopActionExtension.html
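The Sqoop action from that link can be sketched as a workflow.xml fragment roughly like this (workflow name, properties, and parameter names here are illustrative, assuming Oozie 4.x; the command mirrors Ravi's import):

```xml
<workflow-app name="sqoop-import-wf" xmlns="uri:oozie:workflow:0.4">
  <start to="sqoop-import"/>
  <action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- One-line Sqoop command; values come from job.properties -->
      <command>import --connect jdbc:oracle:thin:@${dbHost}:1521/${dbName} --username ${dbUser} --password-file ${passwordFile} --table ${table} --hive-import --hive-table default.${table}</command>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Sqoop import failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

An Oozie coordinator can then trigger this workflow on a schedule, which covers the "automate in production" part of the question.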
>
>
>
> --
> Laurent HATIER - Consultant Big Data & Business Intelligence chez CapGemini
> fr.linkedin.com/pub/laurent-hatier/25/36b/a86/
> <http://fr.linkedin.com/pub/laurent-h/25/36b/a86/>
>
> 2014-10-24 17:38 GMT+02:00 Girish Lingappa <glingappa@pivotal.io>:
>
>> Ravi
>> If you are using oozie in your production environment one option is to
>> plugin your sqoop job into the oozie workflow xml using oozie sqoop action.
>>
>> Thanks
>> Girish
>>
>> Sent from my iPhone
>>
>> > On Oct 24, 2014, at 4:17 AM, Dhandapani, Karthik
>> <Karthik.Dhandapani@CVSCaremark.com> wrote:
>> >
>> > Hi,
>> >
>> > There is an option.
>> >
>> > Use  --password-file <path>  to set the path of a file containing the
>> > authentication password.
>> >
>> > http://sqoop.apache.org/docs/1.4.4/SqoopUserGuide.html
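A minimal sketch of the `--password-file` setup (the file path and password value are placeholders, not from this thread):

```shell
# Keep the DB password in a restricted file instead of on the command line
# (where `ps` would expose it). Path and value below are placeholders.
PASSFILE="$HOME/.sqoop.password"
printf 'mypassword' > "$PASSFILE"   # no trailing newline; Sqoop reads the file verbatim
chmod 400 "$PASSFILE"               # owner read-only

# The file can also be put on HDFS and referenced from there, e.g.:
#   hdfs dfs -put "$PASSFILE" /user/hdfs/.sqoop.password
# Then replace  --password password  in the import with one of:
#   --password-file "file://$PASSFILE"              (local file)
#   --password-file /user/hdfs/.sqoop.password     (HDFS path)
```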
>> >
>> > All the dynamic parameter values can be passed in as Unix variables to
>> > automate the Sqoop script for different tables. Copy the script below
>> > into a .sh file and run it from any scheduler.
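That approach can be sketched as a parameterized wrapper (all variable names, hosts, and paths are illustrative): dynamic values arrive as environment variables, so one .sh file serves many tables and any scheduler can run it.

```shell
#!/bin/sh
# Sketch: a reusable Sqoop import wrapper driven by environment variables.
# Every default below is a placeholder, not a value from the original post.
DB_HOST="${DB_HOST:-10.0.0.1}"                       # placeholder Oracle host
DB_NAME="${DB_NAME:-ORCL}"
DB_USER="${DB_USER:-scott}"
PASSFILE="${PASSFILE:-/user/hdfs/.sqoop.password}"   # per the --password-file tip
TABLE="${TABLE:-oracreport}"
COLUMNS="${COLUMNS:-column1,column2,column3}"

# Assemble the command once; backslash-newline inside double quotes is a
# POSIX line continuation, so SQOOP_CMD ends up as a single line.
SQOOP_CMD="sqoop import \
 --connect jdbc:oracle:thin:@${DB_HOST}:1521/${DB_NAME} \
 --username ${DB_USER} --password-file ${PASSFILE} \
 --table ${TABLE} --columns ${COLUMNS} \
 --hive-import --hive-overwrite --hive-table default.${TABLE} \
 --lines-terminated-by '\n' --fields-terminated-by ',' \
 --target-dir /user/hdfs/${TABLE}"

# Printed here for illustration; a production script would execute it instead.
printf '%s\n' "$SQOOP_CMD"
```

A scheduler (cron, Autosys, an Oozie shell action, ...) would export `TABLE`, `COLUMNS`, etc. before invoking the script, so the same file handles every table.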
>> >
>> > Thanks,
>> > Karthik
>> > ________________________________________
>> > From: Ravi Prasad [raviprasad29@gmail.com]
>> > Sent: Friday, October 24, 2014 7:05 AM
>> > To: user@hadoop.apache.org
>> > Subject: How to automate the Sqoop script in Production environment
>> >
>> > Hi all,
>> >
>> > 1)  Can anyone please suggest how to automate the Sqoop scripts in
>> > the production environment?
>> >
>> > I need to import data from Oracle tables to Hadoop Hive tables using
>> > the script below.
>> >
>> > sqoop import --connect jdbc:oracle:thin:@<ipaddress>:1521/<DB> \
>> >   --username username --password password --table <tablename> \
>> >   --columns column1,column2,column3 --hive-import --hive-overwrite \
>> >   --hive-table default.oracreport --lines-terminated-by '\n' \
>> >   --fields-terminated-by ',' --target-dir /user/hdfs/
>> >
>> >
>> > 2) Is there any way to hide the password?
>> >
>> > ----------------------------------------------
>> > Regards,
>> > RAVI PRASAD. T
>>
>
>


-- 
----------------------------------------------
Regards,
RAVI PRASAD. T
