hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "Release1.0Requirements" by DougCutting
Date Thu, 09 Oct 2008 21:03:00 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by DougCutting:

  ''I am not sure if we as a community can start enforcing protocol compatibility right away, because in order to do that we would have to version the RPC parameters. Once we version the data types, we can change the policy and enforce it.
  However, we can easily support backward compatibility when new methods are added. I am not sure if this subset can be documented in the backward-compatibility policy (more correctly, I don't know how to word it); however, I will file a jira to change the RPC layer to detect and throw exceptions on missing methods. I will also follow that up with a patch for that jira. This will allow us to treat the addition of a new method as a backward-compatible change that does not require the version # to be changed.''
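The missing-method detection described above could be sketched roughly as follows: a dispatcher that turns an unknown method name into a well-defined exception rather than an obscure failure. This is a minimal reflection-based illustration; the class names (`RpcDispatcher`, `MissingMethodDemo`) are hypothetical and not Hadoop's actual RPC layer.

```java
import java.lang.reflect.Method;

// Hypothetical sketch: an RPC dispatcher that throws a clear exception when
// a client invokes a method the server does not implement. This is what lets
// adding a new server-side method remain a backward-compatible change: old
// servers report "unknown method" cleanly instead of failing obscurely.
class RpcDispatcher {
    private final Object target;

    RpcDispatcher(Object target) { this.target = target; }

    Object call(String methodName, Class<?>[] paramTypes, Object... args)
            throws Exception {
        Method m;
        try {
            m = target.getClass().getMethod(methodName, paramTypes);
        } catch (NoSuchMethodException e) {
            // Unknown method: surface a reportable error to the caller.
            throw new Exception("Unknown RPC method: " + methodName);
        }
        return m.invoke(target, args);
    }
}

public class MissingMethodDemo {
    public static class Service {
        public String ping() { return "pong"; }
    }

    public static void main(String[] args) throws Exception {
        RpcDispatcher d = new RpcDispatcher(new Service());
        System.out.println(d.call("ping", new Class<?>[0]));
        try {
            // A method added in a newer release, not present on this server.
            d.call("shutdown", new Class<?>[0]);
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
```

A newer client calling `shutdown` against this older server gets a clear "Unknown RPC method" error rather than a protocol-level failure.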
+ ''Sanjay, about versioning RPC parameters: On the mailing list I proposed a mechanism by which, with a small change to the RPC mechanism itself, we could start manually versioning parameters as they are modified. Under this proposal, existing parameter implementations would not need to be altered until they next change incompatibly. It's perhaps not the best long-term solution, but it would, if we wanted, permit us to start requiring back-compatible protocols soon. --DougCutting''
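Manual parameter versioning of this kind could look roughly like the following: each parameter serializes a version number ahead of its fields, so a reader can detect data from an incompatible writer. This is an illustrative sketch in the spirit of Hadoop's `Writable` serialization; the class `BlockReport` and its fields are hypothetical, not the actual Hadoop types.

```java
import java.io.*;

// Hypothetical versioned RPC parameter. The version byte is written first
// and bumped only when the wire format changes incompatibly, so unchanged
// parameter classes need no modification.
class BlockReport {
    static final byte VERSION = 1;
    long blockId;
    long length;

    void write(DataOutput out) throws IOException {
        out.writeByte(VERSION);   // version travels with the data
        out.writeLong(blockId);
        out.writeLong(length);
    }

    void readFields(DataInput in) throws IOException {
        byte v = in.readByte();
        if (v != VERSION) {       // reject incompatible senders early
            throw new IOException("Incompatible BlockReport version: " + v);
        }
        blockId = in.readLong();
        length = in.readLong();
    }
}

public class VersionedParamDemo {
    public static void main(String[] args) throws IOException {
        BlockReport sent = new BlockReport();
        sent.blockId = 42L;
        sent.length = 1024L;

        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        sent.write(new DataOutputStream(buf));

        BlockReport received = new BlockReport();
        received.readFields(new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())));
        System.out.println(received.blockId + " " + received.length);
    }
}
```

The key property is that only a parameter class whose format actually changes incompatibly ever needs its version bumped; all others round-trip unmodified.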
  === Time frame for 1.0 to 2.0 ===
  What is the expectation for the life of 1.0 before it goes to 2.0? Clearly, if we switch from 1.0 to 2.0 in 3 months, the compatibility benefit of 1.0 does not deliver much value for Hadoop customers. A time frame of 12 months is probably the minimum.
