ignite-user mailing list archives

From vkulichenko <valentin.kuliche...@gmail.com>
Subject Re: Best Practice for Leveraging Ignite to generate large number records files
Date Mon, 31 Aug 2015 23:14:01 GMT
diopek wrote
> We are developing a batch application that will eventually generate an
> unordered data file containing a large number of records, around 1 GB,
> using multiple threads/processes. What would be the best practice to
> accomplish this using Ignite: an Ignite data cache with the write-behind
> (file system) flag enabled, or an Ignite cache with one process writing
> records into the cache and another process reading and removing them, or
> any other suggestion to help enhance this batch's performance?
> Also, the batch process uses the Spring Batch partitioning (local/remote)
> feature, and I noticed that the Spring Batch reference documentation
> mentions GridGain as one possible middleware that can be leveraged as a
> grid fabric solution. Is there any utility library that can fulfill such
> an SB remote partitioning mechanism for Ignite/GridGain?

It sounds like you can use IgniteDataStreamer:
https://apacheignite.readme.io/v1.3/docs/data-streamers. I think you should
implement and configure your own StreamReceiver that writes the data to
files. Note that this way you bypass the cache (i.e., no data is kept in
memory), but entries are still mapped to nodes by affinity, just as they
would be if stored in the cache. The streamer handles this mapping, as well
as batching, automatically.
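As a rough sketch of the idea, a custom StreamReceiver can append each batch of entries to a local file instead of putting them into the cache. The class and file names below are hypothetical, and this assumes Ignite 1.x-style APIs (StreamReceiver.receive taking the target cache and a batch of entries):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.util.Collection;
import java.util.Map;

import org.apache.ignite.IgniteCache;
import org.apache.ignite.IgniteException;
import org.apache.ignite.stream.StreamReceiver;

/**
 * Hypothetical receiver that appends each streamed batch to a local file
 * on the node the entries were mapped to, instead of caching them.
 */
public class FileStreamReceiver implements StreamReceiver<Long, String> {
    @Override
    public void receive(IgniteCache<Long, String> cache,
                        Collection<Map.Entry<Long, String>> entries) throws IgniteException {
        // Append the batch to a local file; the cache itself is never
        // updated, so no data is kept in memory.
        try (BufferedWriter out = Files.newBufferedWriter(
                Paths.get("records.txt"),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            for (Map.Entry<Long, String> e : entries)
                out.write(e.getKey() + "," + e.getValue() + System.lineSeparator());
        }
        catch (IOException ex) {
            throw new IgniteException("Failed to write batch to file", ex);
        }
    }
}
```

On the streaming side it might be wired up like this (cache name and record contents are placeholders):

```java
try (IgniteDataStreamer<Long, String> streamer = ignite.dataStreamer("recordsCache")) {
    streamer.receiver(new FileStreamReceiver());

    for (long i = 0; i < 1_000_000; i++)
        streamer.addData(i, "record-" + i);
} // close() flushes any remaining batches
```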

Hope this helps.

