ignite-user mailing list archives

From Anil <anilk...@gmail.com>
Subject Re: Loading Hbase data into Ignite
Date Tue, 11 Oct 2016 15:16:45 GMT
Hi Alexey,

We are planning to have a 4-node cluster, and we will increase the number of
nodes based on performance.

The key is a unique string (the unique part of the HBase record's primary
key). Each record has around 25-30 fields, but they are all small. Records
won't have much content.

All 18M records are related to a single use case, so we plan to keep them in
one cache so that pagination, filtering and sorting are supported at the cache
level itself (a rough sketch of the cache config and query is below).
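
For reference, a minimal sketch of the cache side, assuming SQL indexing for
filter/sort/paging. The "Record" class, its fields and the cache name are
placeholders, not our real model:

import java.util.List;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.cache.query.annotations.QuerySqlField;
import org.apache.ignite.configuration.CacheConfiguration;

public class RecordCacheSketch {
    /** Placeholder value class; the real record has ~25-30 such small fields. */
    public static class Record {
        @QuerySqlField(index = true)
        public String id;

        @QuerySqlField(index = true)
        public String status;
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            CacheConfiguration<String, Record> cfg = new CacheConfiguration<>("recordCache");
            cfg.setIndexedTypes(String.class, Record.class);

            IgniteCache<String, Record> cache = ignite.getOrCreateCache(cfg);

            // Filter + sort + paginate at the cache level via SQL.
            SqlFieldsQuery qry = new SqlFieldsQuery(
                "select id, status from Record where status = ? order by id limit ? offset ?")
                .setArgs("ACTIVE", 50, 0);

            List<List<?>> page = cache.query(qry).getAll();

            System.out.println("Fetched " + page.size() + " rows");
        }
    }
}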

The initial load will just write to the cache, and changes (or new objects)
will be added to or updated in the existing cache using a Kafka stream. The
initial load would look roughly like the sketch below.
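
A rough sketch of the initial load, assuming the cache above already exists
and using IgniteDataStreamer over a plain HBase scan. The table, column family
and qualifier names are placeholders, the value is simplified to a String
(the real version would build our record object from the row's columns), and
the Kafka part just feeds the same cache later, so it is left out. Bounding
each node's Scan by its regions' start/stop keys is how we'd try to spread
the load:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class InitialLoadSketch {
    public static void main(String[] args) throws Exception {
        Configuration hbaseCfg = HBaseConfiguration.create();

        try (Ignite ignite = Ignition.start();
             Connection hbase = ConnectionFactory.createConnection(hbaseCfg);
             Table table = hbase.getTable(TableName.valueOf("records"));            // placeholder table name
             IgniteDataStreamer<String, String> streamer = ignite.dataStreamer("recordCache")) {

            // "recordCache" is assumed to be created already (see the config sketch above).
            streamer.allowOverwrite(true); // allow re-runs of the load to overwrite existing entries

            Scan scan = new Scan(); // could be bounded by a region's start/stop row on each node

            try (ResultScanner rs = table.getScanner(scan)) {
                for (Result row : rs) {
                    String key = Bytes.toString(row.getRow()); // the unique part of the row key
                    String val = Bytes.toString(row.getValue(Bytes.toBytes("cf"),       // placeholder family
                                                             Bytes.toBytes("payload"))); // placeholder qualifier
                    streamer.addData(key, val);
                }
            }
        }
    }
}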

Thanks.


On 11 October 2016 at 19:03, Alexey Kuznetsov <akuznetsov@apache.org> wrote:

> Hi, Anil.
>
> It depends on your use case.
> How many nodes will be in your cluster?
> Will all 18M records be in one cache or in many caches?
> How big is a single record? What will be the key?
> Do you only need to load, or do you also need to write changed / new objects
> from the cache back to HBase?
>
> On Tue, Oct 11, 2016 at 8:11 PM, Anil <anilklce@gmail.com> wrote:
>
>> Hi,
>>
>> We have around 18M records in HBase which need to be loaded into an
>> Ignite cluster.
>>
>> I was looking at:
>>
>> http://apacheignite.gridgain.org/v1.7/docs/data-loading
>>
>> https://github.com/apache/ignite/tree/master/examples
>>
>> Is there any approach where each Ignite node loads the data of one HBase
>> region?
>>
>> Do you have any recommendations ?
>>
>> Thanks.
>>
>
>
>
> --
> Alexey Kuznetsov
>
