hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "Release1.0Requirements" by SanjayRadia
Date Thu, 02 Oct 2008 21:10:23 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by SanjayRadia:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
  
  ''Do you mean serialization of user objects, or RPC parameters?  We have support for multi-lingual user objects already.  As for RPC, I don't yet see the case for forcing 1.0 to wait until we're ready to support multiple native client libraries. --DougCutting''
  
+ ''I mean RPC parameters. This one is borderline. We are hurting from not having a language-neutral serialization. Language-neutral serialization is not that hard to do: there are many schemes that already provide it (Sun RPC, Protocol Buffers, Etch, Hessian, Thrift). We have to pick rather than invent.''
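
A minimal sketch of what an off-the-shelf scheme such as Protocol Buffers would give us (illustration only; the GetFileInfoRequest class is hypothetical, assumed generated by protoc from the schema shown in the comment):

{{{
// Hypothetical message type, assumed generated by protoc from:
//
//   message GetFileInfoRequest {
//     required string path = 1;
//   }
//
// The wire bytes carry only field tags and values -- no Java class names --
// so a C++ or Python client with a protobuf runtime reads them unchanged.
import com.google.protobuf.InvalidProtocolBufferException;

public class WireFormatSketch {

  /** Serialize an RPC parameter to a language-neutral byte stream. */
  static byte[] encode(GetFileInfoRequest request) {
    return request.toByteArray();
  }

  /** Decode on the receiving side -- in any language, not just Java. */
  static GetFileInfoRequest decode(byte[] wire)
      throws InvalidProtocolBufferException {
    return GetFileInfoRequest.parseFrom(wire);
  }

  public static void main(String[] args) throws Exception {
    byte[] wire = encode(
        GetFileInfoRequest.newBuilder().setPath("/user/sanjay").build());
    System.out.println(decode(wire).getPath());  // prints /user/sanjay
  }
}
}}}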
+ 
+ === Versioning Scheme - Manual or Automated ===
+ Hadoop is likely to see fairly significant changes between 1.0 and 2.0. Given the compatibility requirements, we need some scheme (manual or automated) for versioning the RPC interfaces, and also for versioning the data types that are passed as RPC parameters.
+ 
+ ==== Discussion on Manual vs Automated Scheme for Versioning ====
+ A manual scheme is too messy and cumbersome. An automated scheme à la Protocol Buffers, Etch, Thrift, or Hessian should be used.
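
A sketch of how the automated flavor handles evolution, again using Protocol Buffers as the example (the DatanodeReportV1/DatanodeReportV2 classes are hypothetical, standing in for the same message generated from two schema revisions):

{{{
// Assumed generated by protoc from two revisions of one message:
//
//   v1: message DatanodeReport { required string name = 1; }
//   v2: message DatanodeReport { required string name     = 1;
//                                optional int64  capacity = 2; }
//
// Fields are identified on the wire by tag number, so versioning is
// automatic: an old reader skips tag 2, and a new reader substitutes a
// default when tag 2 is absent. No hand-maintained version constant.
public class EvolutionSketch {
  public static void main(String[] args) throws Exception {
    byte[] wire = DatanodeReportV2.newBuilder()
        .setName("dn42.example.com")
        .setCapacity(1L << 40)
        .build()
        .toByteArray();

    // A v1 ("old") reader parses bytes written by the v2 ("new") writer.
    DatanodeReportV1 report = DatanodeReportV1.parseFrom(wire);
    System.out.println(report.getName());  // dn42.example.com
  }
}
}}}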
+ 
  === RPC server that's forward-friendly ===
  
   * The RPC mechanism should be able to detect, and throw an exception for, calls to new methods that old servers don't implement (see the sketch below).
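
A sketch of the detection half using reflection (ForwardFriendlyDispatcher is made up and stands in for the dispatch loop inside the real org.apache.hadoop.ipc.Server):

{{{
import java.lang.reflect.Method;

// When a newer client calls a method an older server has never heard of,
// the server should answer with a well-defined exception rather than an
// opaque protocol failure or a dropped connection.
public class ForwardFriendlyDispatcher {

  private final Object protocolImpl;

  public ForwardFriendlyDispatcher(Object protocolImpl) {
    this.protocolImpl = protocolImpl;
  }

  public Object call(String methodName, Class<?>[] parameterTypes,
                     Object[] arguments) throws Exception {
    Method method;
    try {
      method = protocolImpl.getClass().getMethod(methodName, parameterTypes);
    } catch (NoSuchMethodException e) {
      // Unimplemented (presumably newer) method detected: surface it as an
      // exception the RPC layer can marshal back to the calling client.
      throw new UnsupportedOperationException(
          "Server does not implement " + methodName
              + "(); the client may be newer than this server", e);
    }
    return method.invoke(protocolImpl, arguments);
  }
}
}}}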
