hadoop-mapreduce-user mailing list archives

From Jay Vyas <jayunit100.apa...@gmail.com>
Subject Re: HDFS-based database for Big and Small data?
Date Sat, 03 Jan 2015 15:50:30 GMT
1) Phoenix can be used on top of HBase for richer querying semantics. That combo might be
good for complex workloads.
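To make the Phoenix point concrete, here is a minimal sketch of the SQL layer it puts over HBase. The table, column, and index names are hypothetical; the statement-building helper is plain Python and runs standalone, while actually executing the SQL would need a live Phoenix query server (e.g. via the python-phoenixdb driver).

```python
def upsert_statement(table, columns):
    """Build a parameterized Phoenix UPSERT for the given columns."""
    cols = ", ".join(columns)
    marks = ", ".join("?" for _ in columns)
    return f"UPSERT INTO {table} ({cols}) VALUES ({marks})"

# Hypothetical DDL of the kind Phoenix accepts -- note the secondary
# index, which plain HBase does not give you out of the box:
DDL = """
CREATE TABLE IF NOT EXISTS docs (
    id VARCHAR PRIMARY KEY,
    author VARCHAR,
    body VARCHAR
);
CREATE INDEX IF NOT EXISTS docs_by_author ON docs (author);
"""

sql = upsert_statement("docs", ["id", "author", "body"])
print(sql)  # UPSERT INTO docs (id, author, body) VALUES (?, ?, ?)
```

The point is that Phoenix lets you express upserts, ad hoc queries, and secondary indexes in SQL while HBase handles the storage underneath.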

2) SolrCloud might also fit the bill here?

Solr can be backed by any Hadoop-compatible FS, including HDFS; it gains resilience from that
mechanism, and offers sophisticated indexing and searching options.

Although the querying is limited...
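A rough sketch of what that search side looks like from Python (the field names and collection are hypothetical). Assembling the request parameters is plain Python; sending them would require a running SolrCloud cluster, e.g. with a client such as pysolr.

```python
def solr_params(text, field="body", rows=10, facet_field=None):
    """Assemble Solr query parameters for a fielded search,
    optionally faceting on another field."""
    params = {"q": f"{field}:{text}", "rows": rows, "wt": "json"}
    if facet_field:
        params.update({"facet": "true", "facet.field": facet_field})
    return params

params = solr_params("hadoop", facet_field="author")
print(params["q"])      # body:hadoop
print(params["facet"])  # true
```

Faceting and full-text scoring are where Solr shines; rich relational-style querying is where it is limited, as noted above.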

> On Jan 3, 2015, at 9:39 AM, Wilm Schumacher <wilm.schumacher@gmail.com> wrote:
>> Am 03.01.2015 um 08:44 schrieb Alec Taylor:
>> Want to replace MongoDB with an HDFS-based database in my architecture.
>> Note that this is a new system, not a rewrite of an old one.
>> Are there any open-source "fast" read/write databases built on HDFS
> yeah. As Ted wrote: hbase.
>> with a model similar to a document-store,
> well, then PERHAPS hbase isn't the right choice. What exactly do you
> need from the definition of a "doc-store"? If you e.g. rely heavily on ad
> hoc queries or secondary indexes, then hbase could mean some
> additional work for you.
>> that can hold my regular
>> business logic and enables an object model in Python? (E.g.: via Data
>> Mapper or Active Record patterns)
> in addition to Ted's link, you could also use Thrift, if that gives you enough
> control. Depends on your requirements.
> Best wishes,
> Wilm
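To illustrate the "additional work" Wilm mentions: since HBase has no built-in secondary indexes, a document store on top of it typically maintains index entries as extra rows. A minimal sketch under assumed conventions (the column family `d`, the `idx|...` key format, and the document fields are all hypothetical); the key-building logic below is plain Python, while writing the rows would go through a Thrift client such as happybase against a live cluster.

```python
import json

def doc_row(doc_id, doc):
    """Primary row: key is the document id, value is the JSON body
    stored in one cell of the (hypothetical) 'd' column family."""
    return doc_id.encode(), {b"d:json": json.dumps(doc).encode()}

def index_row(field, value, doc_id):
    """Secondary-index row: the key encodes field=value -> doc id,
    so a prefix scan on 'idx|field|value|' finds matching docs."""
    key = f"idx|{field}|{value}|{doc_id}"
    return key.encode(), {b"d:ref": doc_id.encode()}

key, cells = doc_row("doc-1", {"author": "alec", "body": "hello"})
idx_key, _ = index_row("author", "alec", "doc-1")
print(key)      # b'doc-1'
print(idx_key)  # b'idx|author|alec|doc-1'
```

Every write then has to update both rows (ideally in one batch), and every "query by author" becomes a prefix scan over the index rows; that bookkeeping is exactly what Phoenix, or a real document store, would otherwise do for you.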
