hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "Release1.0Requirements" by SanjayRadia
Date Thu, 09 Oct 2008 20:27:34 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by SanjayRadia:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
  
  ''I think we should be very careful about which network protocols we publicly expose.  Currently
we expose none.  I do not think we should attempt to expose all soon.  A first obvious candidate
to expose might be the job submission protocol.  Before we do so we should closely revisit
its design, since it was not designed as an API with long-term, multi-language access in mind.
 Any logic that we can easily move server-side, we should, to minimize duplicated code.  Etc.
 The HDFS protocols will require more scrutiny, since they involve more client side logic.
 It would be simpler if all of HDFS was implemented using RPC, not a mix of RPC and raw sockets.
 So we might decide to delay publicly exposing the HDFS protocols until we have made that
switch, should it prove feasible.  I think we could reasonably have a 1.0 release that exposed
none.  I do not see this as a gating issue for a 1.0 release.  We could reasonably expose
such protocols in the course of 1.x releases, no?  
 --DougCutting''
  
+ ''I wasn't suggesting making the network protocols public - that would be a serious undertaking,
much more than making an API public. Since the HDFS protocols have a non-trivial client-side library,
 I don't expect anyone to use the protocol directly. I would not want to consider exposing
the HDFS protocol before 2.0, and maybe much later. (I am not sure the mix of RPC and
raw sockets makes things much worse ...) (BTW, if you recall, we recently made DFSClient private -
that was a very deliberate action.) So I believe we are in agreement here. My motivation for
language neutrality was direct access via C and a scripting language. I was expecting that
we would supply the client side for these languages and that we would *not* expose the protocol.
I see language-neutral marshalling as a small first step towards a non-Java client. --SanjayRadia''
+ 
  === Versioning Scheme - Manual or Automated ===
  Hadoop is likely to see fairly significant changes between 1.0 and 2.0. Given the compatibility
requirements, we need some scheme (manual or automated) for versioning the RPC interfaces,
and also for versioning the data types that are passed as RPC parameters.
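  As a rough illustration of the manual scheme, the sketch below shows an RPC interface that carries a hand-maintained version number (the approach Hadoop's existing `VersionedProtocol` interface takes) together with a `Writable` parameter type that writes its own version tag into the stream. The names `ExampleProtocol`, `ExampleStatus` and the version numbers are hypothetical, not taken from any actual Hadoop interface.
  {{{
// A minimal sketch of manual versioning, assuming the existing
// org.apache.hadoop.ipc.VersionedProtocol contract. ExampleProtocol,
// ExampleStatus and the version numbers are hypothetical.
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Writable;
import org.apache.hadoop.ipc.VersionedProtocol;

/** Hypothetical RPC interface, versioned by hand. */
public interface ExampleProtocol extends VersionedProtocol {
  /** Bumped by hand on every incompatible change to this interface.
   *  The server's getProtocolVersion() implementation returns this value. */
  long versionID = 3L;

  ExampleStatus getStatus(String path) throws IOException;
}

/** Hypothetical RPC parameter type that carries its own version tag. */
class ExampleStatus implements Writable {
  private static final byte CURRENT_VERSION = 1;
  private long length;

  public void write(DataOutput out) throws IOException {
    out.writeByte(CURRENT_VERSION);  // version tag precedes the payload
    out.writeLong(length);
  }

  public void readFields(DataInput in) throws IOException {
    byte version = in.readByte();
    if (version > CURRENT_VERSION) {
      throw new IOException("Unsupported ExampleStatus version: " + version);
    }
    length = in.readLong();
  }
}
  }}}
  An automated scheme would derive these version identifiers from the interface and type definitions themselves instead of relying on hand-bumped constants.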
  
