Hello!

Well, the COPY command does allow you to do column mapping:

COPY FROM '/path/to/local/file.csv'
INTO tablename (columnName, columnName, ...) FORMAT CSV

If you need to do non-trivial transformations, you can use the JDBC driver in SET STREAMING ON mode.
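A rough sketch of the statement sequence the driver runs in that mode (the table and column names below are made up for illustration; only building the statements is shown here — actually executing them needs a running cluster and a JDBC connection from your language of choice):

```python
# Sketch only (not a complete client): build the SQL a JDBC session would
# execute for a streamed bulk load. Table/column names are hypothetical.

def streaming_statements(table, columns):
    """Return (setup, insert, teardown) SQL for a streamed bulk load."""
    cols = ", ".join(columns)
    params = ", ".join("?" for _ in columns)
    return (
        "SET STREAMING ON",    # driver starts buffering and batching inserts
        f"INSERT INTO {table} ({cols}) VALUES ({params})",  # run per row/batch
        "SET STREAMING OFF",   # flushes any remaining buffered rows
    )

setup, insert, teardown = streaming_statements("person", ("id", "name", "city"))
print(insert)  # INSERT INTO person (id, name, city) VALUES (?, ?, ?)
```

The point of SET STREAMING ON is that the driver batches the parameterized inserts instead of doing a round trip per row, which is what makes it suitable for bulk loads with custom transformations.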

Regards,
--
Ilya Kasnacheev


Tue, 29 Oct 2019 at 13:11, Muhammed Favas <favas.muhammed@expeedsoftware.com>:

Hi,


I have tried a simple Python program without using Spark. First I read the whole CSV into a pandas dataframe.

Now I want to bulk insert the whole dataframe into an Ignite table without looping through it.


The purpose of this test is to evaluate the best (i.e. fastest) way to bulk load CSV files into Ignite.


I cannot use the Ignite COPY command here, because I need an option to do column mapping while importing the CSV files.
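A minimal sketch of the mapping step itself, using only the Python standard library (the CSV header names, table name, and column names below are made up for illustration): read the file with csv.DictReader and reorder each row into parameter tuples matching the target table's columns, ready for a batched insert.

```python
import csv
import io

# Hypothetical mapping from CSV header names to Ignite table columns.
COLUMN_MAP = {"person_id": "id", "full_name": "name"}

def mapped_rows(csv_file, column_map):
    """Yield parameter tuples ordered by the source columns in column_map."""
    reader = csv.DictReader(csv_file)
    source_cols = list(column_map)  # the CSV headers we care about
    for row in reader:
        yield tuple(row[c] for c in source_cols)

# Stand-in for a real file on disk; note the unmapped 'extra' column is dropped.
data = io.StringIO("person_id,full_name,extra\n1,Alice,x\n2,Bob,y\n")
rows = list(mapped_rows(data, COLUMN_MAP))
# rows == [('1', 'Alice'), ('2', 'Bob')]

targets = ", ".join(COLUMN_MAP.values())
sql = f"INSERT INTO person ({targets}) VALUES (?, ?)"
# sql == "INSERT INTO person (id, name) VALUES (?, ?)"
```

The resulting rows and parameterized INSERT can then be fed to whichever client executes the load, so the column mapping never forces a per-row Python loop over the dataframe itself.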


Regards,

Favas 


From: Stephen Darlington <stephen.darlington@gridgain.com>
Sent: Monday, October 28, 2019 5:05 PM
To: user@ignite.apache.org
Subject: Re: Write python dataframe to ignite table.


What have you tried? As long as your command line includes the right JAR files, it seems to more-or-less just work for me:


https://medium.com/@sdarlington/the-trick-to-successfully-integrating-apache-ignite-and-pyspark-890e436d09ba


Regards,

Stephen



On 22 Oct 2019, at 11:41, Muhammed Favas <favas.muhammed@expeedsoftware.com> wrote:


Hi,


Is there a way to bulk load Python dataframe values into an Ignite table?


Regards,

Favas