hive-user mailing list archives

From Mich Talebzadeh <>
Subject Re: Copying all Hive tables from Prod to UAT
Date Thu, 26 May 2016 09:01:45 GMT
That is a good point, Jorn, with regard to JDBC and Hive data.

I believe you can use JDBC to get compressed data from an Oracle or
Sybase database, because decompression happens at the time of data access,
much like using the sqlplus or isql tools.

However, it is worth trying what happens when one accesses Hive data
through JDBC where the underlying table is compressed using bzip2 or Snappy.
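One way to try this (a sketch only; the HiveServer2 host, database and table
names below are assumptions, not anything from this thread) is to create a
bzip2-compressed copy of a table and then query it over JDBC with beeline.
The script only prints the commands, so you can review and adapt them before
running anything:

```shell
#!/bin/sh
# Dry-run sketch: print the commands to test JDBC access to a
# bzip2-compressed Hive table. Host, database and table names are
# assumptions -- substitute your own.
JDBC_URL="jdbc:hive2://hiveserver:10000/test_db"  # assumed HiveServer2 endpoint
DB="test_db"                                      # assumed database
SRC="sales"                                       # assumed source table

cat <<EOF
hive -e "SET hive.exec.compress.output=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.BZip2Codec;
CREATE TABLE ${DB}.${SRC}_bz2 AS SELECT * FROM ${DB}.${SRC};"

beeline -u ${JDBC_URL} -e "SELECT COUNT(1) FROM ${SRC}_bz2;"
EOF
```

If the count comes back normally, decompression is happening on the server
side and the JDBC client only ever sees plain rows, as with sqlplus or isql.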

If this is a one-off request, say copying all tables from a certain database
in Hive in Prod to UAT, I am not sure replication will be suitable, as the
request is for a snapshot.

EXPORT/IMPORT through NAS or scp should be an option. NAS is better as it
saves the scp copy across, with the target having enough external space to
receive the files.

A more useful approach would be to export the full Hive database in binary
format and import it in the target.
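For the one-off copy, the EXPORT/IMPORT route can be scripted per table. A
minimal sketch (the database name, table list and staging path are all
assumptions) that generates the HiveQL to run on each side:

```shell
#!/bin/sh
# Generate EXPORT statements (to run on Prod) and matching IMPORT
# statements (to run on UAT) for a list of tables. Database name,
# table list and staging path are assumptions -- adjust to taste.
DB="prod_db"                    # assumed database name on both sides
STAGE="/mnt/nas/hive_export"    # assumed NAS or HDFS staging directory
TABLES="sales customers orders" # assumed table list

for T in $TABLES; do
  # On Prod: write each table's data plus metadata to the staging area.
  echo "EXPORT TABLE ${DB}.${T} TO '${STAGE}/${T}';"
done

for T in $TABLES; do
  # On UAT: recreate each table from its exported copy.
  echo "IMPORT TABLE ${DB}.${T} FROM '${STAGE}/${T}';"
done
```

Feed the first set of statements to hive -e (or beeline) on Prod, move the
staging directory across if it is not a shared NAS mount, then run the second
set on UAT.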


Dr Mich Talebzadeh


On 26 May 2016 at 07:28, Elliot West <> wrote:

> Hello,
> I've been looking at this recently for moving Hive tables from on-premise
> clusters to the cloud, but the principle should be the same for your
> use-case. If you wish to do this in an automated way, some tools worth
> considering are:
>    - Hive's built in replication framework:
>    - Hive's IMPORT/EXPORT primitives:
>    - AirBnB's ReAir replication tool:
> Elliot.
> On 8 April 2016 at 23:24, Ashok Kumar <> wrote:
>> Hi,
>> Anyone has suggestions how to create and copy Hive and Spark tables from
>> Production to UAT.
>> One way would be to copy table data to external files and then move the
>> external files to a local target directory and populate the tables in
>> target Hive with data.
>> Is there an easier way of doing so?
>> thanks
