From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "Release1.0Requirements" by DougCutting
Date Thu, 02 Oct 2008 21:38:29 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by DougCutting:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
  === Versioning Scheme - Manual or Automated ===
  Hadoop is likely to see fairly significant changes between 1.0 and 2.0. Given the compatibility
requirements, we need some scheme (manual or automated) for versioning the RPC interfaces
and also for versioning the data types that are passed as RPC parameters.
  
+ ''We already have manually maintained versions for protocols.  Automated versions will make
some things simpler (e.g., marshalling) but won't solve the harder back-compatibility problems.
We could manually version data types independently of the protocols by adding a 'version'
field to classes, as is done in [http://svn.apache.org/viewvc/lucene/nutch/trunk/src/java/org/apache/nutch/crawl/CrawlDatum.java?view=markup
Nutch] (search for the readFields method), but that approach doesn't gracefully handle old code
receiving new instances.  A way to handle that is to similarly update the write() method to
use a format compatible with the client's protocol version.  Regardless of how we version
RPC, we need to add tests against older versions. --DougCutting''
+ 
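For reference, a minimal sketch of the version-field approach described above, in the shape of a Hadoop Writable. The class and field names are invented for illustration; see CrawlDatum itself for the real thing.

{{{
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Writable;

// Illustrative only: the serialized form starts with a version byte,
// and readFields() branches on it, as in Nutch's CrawlDatum.
public class VersionedRecord implements Writable {
  private static final byte VERSION = 2;  // bumped whenever the format changes

  private long fetchTime;                 // present since version 1
  private int retries;                    // added in version 2

  public void write(DataOutput out) throws IOException {
    out.writeByte(VERSION);               // always writes the current format
    out.writeLong(fetchTime);
    out.writeInt(retries);
  }

  public void readFields(DataInput in) throws IOException {
    byte version = in.readByte();
    if (version > VERSION) {
      // Old code receiving a new instance: the case that isn't handled
      // gracefully unless write() downgrades to the client's version.
      throw new IOException("Unknown version " + version);
    }
    fetchTime = in.readLong();
    retries = (version >= 2) ? in.readInt() : 0;  // default for old data
  }
}
}}}

Note that write() here always emits the newest format, which is exactly why a version-2 record breaks a version-1 reader; making write() take the peer's protocol version into account, as suggested above, would close that gap.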
  ==== Discussion on Manual vs Automated Scheme for Versioning ====
  A manual scheme is too messy and cumbersome. An automated scheme à la Protocol Buffers,
Etch, Thrift, or Hessian should be used.
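For contrast, what the automated schemes buy is a wire format where fields are identified by numeric tags rather than by position, so a reader can skip fields it does not recognize. Below is a deliberately simplified sketch of that idea; it is not the actual Protocol Buffers or Thrift encoding.

{{{
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Toy tag/length/value encoding: the length prefix lets an old reader
// skip fields added by a newer writer instead of failing on them.
public class TaggedWire {
  static void writeField(DataOutputStream out, int tag, byte[] value)
      throws IOException {
    out.writeInt(tag);           // field number identifies the field
    out.writeInt(value.length);  // length prefix makes skipping possible
    out.write(value);
  }

  static void readFields(DataInputStream in) throws IOException {
    while (in.available() > 0) {
      int tag = in.readInt();
      byte[] value = new byte[in.readInt()];
      in.readFully(value);
      switch (tag) {
        case 1:
          System.out.println("fetchTime = " + new String(value));
          break;
        default:
          // Unknown field from a newer writer: silently skipped.
          break;
      }
    }
  }

  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    DataOutputStream out = new DataOutputStream(buf);
    writeField(out, 1, "1222983509".getBytes());
    writeField(out, 2, "added-later".getBytes());  // unknown to old readers
    readFields(new DataInputStream(
        new ByteArrayInputStream(buf.toByteArray())));
  }
}
}}}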
  
