hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "Release1.0Requirements" by SanjayRadia
Date Thu, 02 Oct 2008 21:35:45 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by SanjayRadia:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
  
  == What does Hadoop 1.0 mean? ==
  
+    * Standard release numbering: only bug fixes in 1.x.y releases, and new features only in 1.x.0 releases.
     * No need for client recompilation when upgrading from 1.x to 1.y, where x <= y
        * Can't remove deprecated classes or methods until 2.0
     * Old 1.x clients can connect to new 1.y servers, where x <= y
-    * Only bug fixes in 1.x.y releases and new features in 1.x.0 releases.
     * New !FileSystem clients must be able to call old methods when talking to old servers. Generally this will be done by having old methods continue to use old RPC methods. However, it is legal for new implementations of old methods to call new RPC methods, as long as the library transparently falls back to the old RPC when talking to an old server.
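The transparent-fallback rule above can be sketched roughly as follows. This is a hypothetical illustration, not actual Hadoop code: the interface, method names (`getStatusV1`/`getStatusV2`), and exception type are all invented for the example. The idea is that the client's public method keeps its old signature, optimistically tries the new RPC, and permanently drops back to the old RPC once the server reveals it does not implement the new one.

```java
// Hypothetical sketch of client-side RPC fallback (names are invented,
// not real Hadoop APIs).

interface NameNodeRpc {
    // Old RPC, assumed present in every 1.x server.
    String getStatusV1(String path);
    // New RPC, assumed present only in newer servers.
    String getStatusV2(String path);
}

// In this sketch, thrown when a server does not implement an RPC method.
class UnknownRpcMethodException extends RuntimeException {}

class CompatClient {
    private final NameNodeRpc server;
    private boolean serverHasV2 = true; // assume a new server until proven otherwise

    CompatClient(NameNodeRpc server) { this.server = server; }

    // The old public method keeps its signature. Internally it may use the
    // new RPC, but it must fall back so old behavior is preserved against
    // old servers.
    String getStatus(String path) {
        if (serverHasV2) {
            try {
                return server.getStatusV2(path);
            } catch (UnknownRpcMethodException e) {
                serverHasV2 = false; // old server: stop retrying the new RPC
            }
        }
        return server.getStatusV1(path);
    }
}
```

Against a new server the wrapper always uses the new RPC; against an old server it pays the failed call once, then uses only the old RPC for the life of the connection.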
  
  ''Owen, you seem to be extending the ["Roadmap"]'s compatibility requirements to RPC protocols,
is that right?  Clients must be back-compatible with older servers and servers must be back-compatible
with older clients.  If so, perhaps we should vote on this new policy and update ["Roadmap"]
accordingly.  We could even start enforcing it before 1.0, so that, e.g., 0.20's protocols
would need to be back-compatible with 0.19's but 0.21's would not.  --DougCutting''
