hadoop-hdfs-user mailing list archives

From Peter Lin <wool...@gmail.com>
Subject Re: rules engine with Hadoop
Date Fri, 19 Oct 2012 20:37:51 GMT
Embedding a rule engine in map/reduce makes much more sense, but as
Ted points out, scaling it isn't easy.

As long as you break the reasoning into map/reduce stages, it should
work. The devil is in the details, and you have to write the rules
efficiently to achieve the goal.
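
To make that concrete, here is a rough sketch of what I mean by embedding
the engine, for Ted's first case below (one conventionally-sized rulebase,
many inputs). The RuleEngine facade is made up for illustration; in practice
you would wrap whatever engine you actually embed (Drools, Jess, etc.):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class RuleMapper extends Mapper<LongWritable, Text, Text, Text> {

  /** Hypothetical facade over whichever engine is actually embedded. */
  public interface RuleEngine {
    Iterable<String> evaluate(String fact);
  }

  private RuleEngine engine;

  @Override
  protected void setup(Context context) {
    // Load the rulebase once per map task (e.g. from files shipped via
    // the distributed cache), not once per input record.
    String rulesPath = context.getConfiguration().get("rules.path");
    engine = createEngine(rulesPath);
  }

  private RuleEngine createEngine(String rulesPath) {
    // Engine-specific wiring elided; plug in a concrete engine here.
    throw new UnsupportedOperationException("plug in a concrete engine");
  }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Assert each input record as a fact and emit one pair per conclusion
    // the fired rules produce.
    for (String conclusion : engine.evaluate(value.toString())) {
      context.write(value, new Text(conclusion));
    }
  }
}

The point of loading in setup() is that rulebase compilation is usually
far more expensive than matching a single fact, so you amortize it across
the whole input split.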

On Fri, Oct 19, 2012 at 3:45 PM, Ted Dunning <tdunning@maprtech.com> wrote:
> Unification in a parallel cluster is a difficult problem.  Writing very
> large-scale unification programs is an even harder problem.
>
> What problem are you trying to solve?
>
> One option would be that you need to evaluate a conventionally-sized
> rulebase against many inputs.  Map-reduce should be trivially capable of
> this.
>
> Another option would be that you want to evaluate a huge rulebase against a
> few inputs.  It isn't clear that this would be useful given the problems of
> huge rulebases and the typically super-linear cost of resolution algorithms.
>
> Another option is that you want to evaluate many conventionally-sized
> rulebases against one or many inputs in order to implement a boosted rule
> engine.  Map-reduce should be relatively trivial for this as well.
>
> What is it that you are trying to do?
> On Fri, Oct 19, 2012 at 12:25 PM, Luangsay Sourygna <luangsay@gmail.com>
> wrote:
>> Hi,
>>
>> Does anyone know of any (open-source) project that builds a rules
>> engine (based on RETE) on top of Hadoop?
>>
>> Searching a bit on the net, I have only seen a small reference to
>> Concord/IBM, but there is barely any information available (and surely
>> it is not open source).
>>
>> Alpha and beta memories would be stored in HBase. Should be possible, no?
>>
>> Regards,
>> Sourygna
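
On the HBase point: storing the alpha and beta memories in HBase should be
possible in principle. A bare-bones sketch of the alpha side, against the
0.94-era client API, might look like this. The table name, column family,
and row-key layout are all invented for illustration, not taken from any
existing project:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class AlphaMemoryStore {

  private final HTable table;

  public AlphaMemoryStore() throws IOException {
    Configuration conf = HBaseConfiguration.create();
    table = new HTable(conf, "alpha_memory");  // assumed table name
  }

  /**
   * Persist one working-memory element under its alpha node. Keying on
   * alphaNodeId + factId means a prefix Scan on the node id recovers
   * that node's memory for the join (beta) stage.
   */
  public void addFact(String alphaNodeId, String factId, byte[] fact)
      throws IOException {
    Put put = new Put(Bytes.toBytes(alphaNodeId + "/" + factId));
    put.add(Bytes.toBytes("wme"), Bytes.toBytes("fact"), fact);
    table.put(put);
  }

  public void close() throws IOException {
    table.close();
  }
}

The open question is not whether the memories can be stored but whether the
many small reads and writes RETE does per asserted fact stay fast enough once
each one is a round trip to a region server. That is where writing the rules
efficiently, as I said above, really matters.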
