incubator-cassandra-dev mailing list archives

From Eric Evans <eev...@rackspace.com>
Subject Re: BinaryMemtable
Date Fri, 03 Apr 2009 13:58:39 GMT
Avinash Lakshman wrote:
> That is what we used to load large amounts of data into Cassandra using M/R.
> We loaded around 12TB of data from Hadoop into Cassandra before we launched
> Inbox Search. This way we could do all the heavy lifting in Hadoop and load
> data at practically network bandwidth, 100 MB/sec. Going the normal route
> with the same load chewed up a lot of CPU resources on the Cassandra servers
> because of all the serialization/deserialization.
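A quick sanity check on the figures quoted above (assuming "12TB" means 12 * 10^12 bytes and that the 100 MB/sec is sustained for the whole transfer; the message does not say whether that rate is per node or aggregate):

```python
# Back-of-the-envelope estimate: time to move 12 TB at ~100 MB/sec.
# Both figures come from the message above; the decimal (10**12, 10**6)
# interpretation of TB and MB is an assumption.
total_bytes = 12 * 10**12      # 12 TB of data loaded from Hadoop
rate_bytes_per_sec = 100 * 10**6  # ~100 MB/sec, "practically network bandwidth"

seconds = total_bytes / rate_bytes_per_sec
hours = seconds / 3600
print(f"{hours:.1f} hours")  # roughly 33 hours of sustained transfer
```

So even at full wire speed the load is on the order of a day and a half, which is why avoiding per-row serialization CPU cost on the Cassandra servers matters at this scale.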

Avinash, do you have any sample code that demonstrates importing data
this way?

-- 
Eric Evans
eevans@rackspace.com
