incubator-cassandra-user mailing list archives

From Jonathan Ellis <jbel...@gmail.com>
Subject Re: A proposed use case, any comments and experience is appreciated
Date Mon, 04 Oct 2010 13:30:54 GMT
A simpler approach might be to insert expiring columns into a 2nd CF
with a TTL of one hour.
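
Roughly, something like this (a sketch using pycassa; the keyspace,
column family, and column names below are made up, and the exact
client API may differ by version -- note that expiring columns need
Cassandra 0.7+):

    import pycassa

    # hypothetical keyspace and host
    pool = pycassa.ConnectionPool('Keyspace1', ['localhost:9160'])
    events = pycassa.ColumnFamily(pool, 'Events')          # permanent data
    recent = pycassa.ColumnFamily(pool, 'EventsLastHour')  # expiring copy

    def record_event(row_key, column_name, value):
        # write to the permanent CF as usual
        events.insert(row_key, {column_name: value})
        # mirror the write into the second CF with a one-hour TTL, so
        # EventsLastHour only ever holds the most recent hour of data
        recent.insert(row_key, {column_name: value}, ttl=3600)

Your hourly Map/Reduce job can then read EventsLastHour in its
entirety instead of slicing a key range out of the main CF.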

On Mon, Oct 4, 2010 at 5:12 AM, Utku Can Topçu <utku@topcu.gen.tr> wrote:
> Hey All,
>
> I'm planning to run Map/Reduce on one of the ColumnFamilies. The keys are
> formed in such a fashion that they are indexed in descending order by time,
> so I'll be analyzing the data for every hour iteratively.
>
> Since the current Hadoop integration does not support analyzing only part of
> a column family, I feel that I'll need to dump the data of the last hour, put
> it into the Hadoop cluster, and do my analysis on the flat text file.
> Can you think of any other "better" way of getting the data of a key range
> into a Hadoop cluster for analysis?
>
> Regards,
>
> Utku
>
>
>



-- 
Jonathan Ellis
Project Chair, Apache Cassandra
co-founder of Riptano, the source for professional Cassandra support
http://riptano.com
