hawq-user mailing list archives

From Jim Campbell <jacampb...@pivotal.io>
Subject RE: Query on Data Extraction from HAWQ
Date Thu, 20 Apr 2017 14:59:09 GMT
You place a copy of gpfdist on the system where you want the file written.
You will need to start the service with the options explained in the
documentation.  On the HAWQ side, you create a writable external table
with the correct options for your application.  From there, it is a single
insert/select statement.
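A minimal sketch of that flow (the host, port, directory, and table names
below are placeholders for illustration, not details from this thread):

```sql
-- 1. On the target machine, start gpfdist serving a writable directory:
--      gpfdist -d /data/extract -p 8081 -l /tmp/gpfdist.log &

-- 2. In HAWQ, define a writable external table that points at that
--    gpfdist instance and mirrors the source table's columns:
CREATE WRITABLE EXTERNAL TABLE sales_export (LIKE sales)
LOCATION ('gpfdist://etl-host:8081/sales_export.csv')
FORMAT 'CSV' (DELIMITER ',');

-- 3. Export the data with a single insert/select:
INSERT INTO sales_export SELECT * FROM sales;
```

The segments write to gpfdist in parallel, so the file lands on the
gpfdist host rather than passing through the master.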

James Campbell
Data Engineer
Pivotal Software
P:  571-247-6511
E:  jacampbell@pivotal.io



On April 20, 2017 at 10:15:24 AM, Joshi Bhanu (bhanu.joshi@lntinfotech.com)
wrote:

Hi,

I need to move the data to another file system. Extract from HAWQ to
downstream 3rd party system.

So I can use gpfdist to extract the data, and the resulting file can then
be moved to an SFTP server.



Regards,

Bhanu



*From:* Jim Campbell [mailto:jacampbell@pivotal.io]
*Sent:* Thursday, April 20, 2017 7:17 PM
*To:* user@hawq.incubator.apache.org; Bhanu Joshi <
Bhanu.Joshi@lntinfotech.com>
*Subject:* Re: Query on Data Extraction from HAWQ



Do you want the flat file placed on HDFS, or do you want it on another file
system?



In both cases you can use writable external tables.  The documentation
provides examples of doing this in multiple ways. Once you create an
external table, you do an insert/select to place the data into it.
If you want the file on HDFS, you will use one of the PXF options.  For
another file system, you could use gpfdist.
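For the HDFS case, a hedged sketch using PXF (the namenode host, path, and
table names are assumptions; PXF listens on port 51200 by default in HAWQ
2.x, but check your cluster's configuration):

```sql
-- Writable external table backed by PXF's HdfsTextSimple profile:
CREATE WRITABLE EXTERNAL TABLE sales_hdfs (LIKE sales)
LOCATION ('pxf://namenode:51200/data/extract/sales?PROFILE=HdfsTextSimple')
FORMAT 'TEXT' (DELIMITER ',');

-- The same insert/select pattern writes the rows out to HDFS:
INSERT INTO sales_hdfs SELECT * FROM sales;
```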



Of course, you can always use the command-line tool psql and select the
data into a file.
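For example, from a psql session connected to HAWQ (table and path are
illustrative):

```sql
-- \copy runs client-side, so the file is written on the machine
-- running psql, not on the cluster:
\copy (SELECT * FROM sales WHERE region = 'EU') TO '/tmp/sales_eu.csv' CSV HEADER
```

Note this funnels all rows through the master and the client, so it suits
smaller extracts; the external-table approaches above write in parallel.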



James Campbell
Data Engineer
Pivotal Software
P:  571-247-6511
E:  jacampbell@pivotal.io







On April 20, 2017 at 9:16:21 AM, Joshi Bhanu (bhanu.joshi@lntinfotech.com)
wrote:

Hi.

I am new to HAWQ. I have a question.

Can you please tell me different ways I can extract the data from HAWQ to
flat files?

I understand HAWQ also stores its data files in HDFS.





Regards,

Bhanu Joshi
