hadoop-mapreduce-user mailing list archives

From Bertrand Dechoux <decho...@gmail.com>
Subject Re: Sqoop and Hadoop
Date Wed, 10 Jul 2013 08:06:31 GMT
Hi,

You don't need Hive or HBase. A basic Hadoop system (HDFS + MapReduce) is
enough.
I believe the documentation is well done.
If you have further questions, you should ask on the correct mailing list:
http://sqoop.apache.org/mail-lists.html
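
For illustration, a minimal sketch of what a PostgreSQL-to-HDFS import could
look like. The host, port, database, table, and target directory below are
placeholder values, and it assumes the PostgreSQL JDBC driver jar is available
to Sqoop:

sqoop import \
    --connect jdbc:postgresql://dbhost:5432/mydb \
    --username myuser \
    -P \
    --table mytable \
    --as-textfile \
    --target-dir /user/hadoop/mytable \
    -m 1

This writes the table as plain text files under the given HDFS directory,
using only HDFS and MapReduce; no Hive or HBase installation is involved.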

Bertrand


On Wed, Jul 10, 2013 at 10:05 AM, Nitin Pawar <nitinpawar432@gmail.com> wrote:

> Why not?
> You can use Sqoop to import into plain text files, Avro files, or sequence
> files.
> Here is one example:
>
> sqoop import \
>              --connect <conn> \
>              --username <user> \
>              -P \
>              --table <table> \
>              --columns "column1,column2,column3,.." \
>              --as-textfile \
>              --target-dir <hdfs dir> \
>              -m 1
>
>
>
> On Wed, Jul 10, 2013 at 1:29 PM, Fatih Haltas <fatih.haltas@nyu.edu> wrote:
>
>> Hi Everyone,
>>
>> I am trying to import data from PostgreSQL to HDFS via Sqoop; however,
>> all the examples I found on the internet talk about Hive, HBase, and
>> similar systems running within Hadoop.
>>
>> I am not using any of these systems. Isn't it possible to import data via
>> Sqoop without having those kinds of systems running on Hadoop?
>>
>> In other words, I am using the Hadoop and MapReduce systems alone. Is it
>> possible to import data from PostgreSQL into that basic Hadoop system via
>> Sqoop?
>>
>
>
>
> --
> Nitin Pawar
>



-- 
Bertrand Dechoux
