avro-dev mailing list archives

From "Scott Carey (JIRA)" <j...@apache.org>
Subject [jira] Issue Comment Edited: (AVRO-647) Break avro.jar into avro.jar, avro-dev.jar and avro-hadoop.jar
Date Tue, 21 Sep 2010 21:03:33 GMT

    [ https://issues.apache.org/jira/browse/AVRO-647?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12913250#action_12913250 ]

Scott Carey edited comment on AVRO-647 at 9/21/10 5:02 PM:
-----------------------------------------------------------

bq. Which classes are you thinking of?

ByteBufferInputStream and ByteBufferOutputStream are used by BinaryDecoder and BinaryEncoder,
so we should consider moving them to util or io.
AvroRemoteException is referenced in many places as well.
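
A minimal sketch of how those two stream classes pair with the binary encoder/decoder is below.  It assumes the post-1.4 locations and factory APIs (org.apache.avro.util plus EncoderFactory/DecoderFactory), so read it as an illustration rather than the exact code at the time this was written:

{code:java}
// A minimal sketch, assuming the classes now live in org.apache.avro.util and the
// post-1.4 EncoderFactory/DecoderFactory API; at the time of this comment they sat
// in the ipc package.
import java.nio.ByteBuffer;
import java.util.List;

import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.util.ByteBufferInputStream;
import org.apache.avro.util.ByteBufferOutputStream;

public class ByteBufferStreamsSketch {
  public static void main(String[] args) throws Exception {
    // Encode straight into a list of ByteBuffers instead of a byte[].
    ByteBufferOutputStream out = new ByteBufferOutputStream();
    BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
    enc.writeString("hello");
    enc.writeLong(42L);
    enc.flush();
    List<ByteBuffer> buffers = out.getBufferList();

    // Decode from those same buffers without first copying them into an array.
    BinaryDecoder dec =
        DecoderFactory.get().binaryDecoder(new ByteBufferInputStream(buffers), null);
    System.out.println(dec.readString(null) + " " + dec.readLong());
  }
}
{code}
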

{quote}
Generic, specific and reflect all depend on ipc for Requestor and Responder. The complicated
bit is that ipc depends on the specific compiler for Handshake{Request,Response}. So perhaps
{Generic,Specific,Reflect}{Requestor,Responder} should all move to ipc to remove that circularity.
That would make the build easier.
{quote}

In order to make a 'core' library, I moved Requestor and Responder to avro-ipc.  It was the
cleanest break that allowed the rest of the Generic/Specific/Reflect API to stay where it is.

Moving them all to ipc doesn't remove the circularity: you still can't build Requestor/Responder
without first building SpecificCompiler and generating classes.  With Specific in 'core', the
ant tasks / maven plugins for the SpecificCompiler can be built on top of core, and then ipc
can be built once the classes that Requestor/Responder need have been generated with the
just-built ant/maven tool.
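
Concretely, that middle step is just a run of the specific compiler over the handshake schema; a rough sketch is below.  The schema path, the output directory, and the org.apache.avro.compiler.specific package are assumptions (the latter is the compiler's later home), so treat this as an illustration of the ordering rather than actual build code:

{code:java}
// Rough sketch of the generation step between "build core + compiler tool" and
// "compile ipc".  The schema path and output directory are hypothetical, and
// org.apache.avro.compiler.specific is the compiler's later package name.
import java.io.File;

import org.apache.avro.Schema;
import org.apache.avro.compiler.specific.SpecificCompiler;

public class GenerateHandshakeSketch {
  public static void main(String[] args) throws Exception {
    File src = new File("share/schemas/org/apache/avro/ipc/HandshakeRequest.avsc");
    File dst = new File("lang/java/ipc/target/generated-sources");

    // Run the just-built compiler to emit HandshakeRequest.java; only once this
    // class exists can Requestor/Responder (and therefore avro-ipc) be compiled.
    Schema handshake = new Schema.Parser().parse(src);
    new SpecificCompiler(handshake).compileToDestination(src, dst);
  }
}
{code}
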

Unless we figure out how to extract the dependency on generated code from Requestor/Responder
(wrappers?), it looks like we have to build the SpecificCompiler before Requestor/Responder.
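
For reference, the coupling is purely compile-time, but it is unavoidable as long as Requestor/Responder reference the generated class directly.  A paraphrased illustration (not the actual Requestor source):

{code:java}
// Paraphrased illustration, not the actual Requestor source: merely referencing
// the generated HandshakeRequest class makes ipc depend on SpecificCompiler output.
import org.apache.avro.Schema;
import org.apache.avro.ipc.HandshakeRequest;
import org.apache.avro.specific.SpecificDatumWriter;

public class HandshakeCouplingSketch {
  public static void main(String[] args) {
    // HandshakeRequest does not exist until SpecificCompiler has been run over the
    // handshake schema, so nothing in this file can compile before that happens.
    Schema schema = HandshakeRequest.SCHEMA$;
    SpecificDatumWriter<HandshakeRequest> writer = new SpecificDatumWriter<>(schema);
    System.out.println("handshake schema: " + schema.getFullName());
  }
}
{code}

One reading of the wrapper idea is to hide that reference behind a hand-written interface or the generic API, so that ipc could compile without the generated classes and bind them in later.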



> Break avro.jar into avro.jar, avro-dev.jar and avro-hadoop.jar
> --------------------------------------------------------------
>
>                 Key: AVRO-647
>                 URL: https://issues.apache.org/jira/browse/AVRO-647
>             Project: Avro
>          Issue Type: Improvement
>          Components: java
>            Reporter: Scott Carey
>            Assignee: Scott Carey
>
> Our dependencies are starting to get a little complicated on the Java side.
> I propose we build two (possibly more) jars related to our major dependencies and functions.
> 1. avro.jar  (or perhaps avro-core.jar)
> This contains all of the core avro functionality for _using_ avro as a library.  This excludes the specific compiler, avro idl, and other build-time or development tools, as well as avro packages for third party integration such as hadoop.  This jar should then have a minimal set of dependencies (jackson, jetty, SLF4J ?).
> 2. avro-dev.jar
> This would contain compilers, idl, development tools, etc.  Most applications will not need this, but build systems and developers will.
> 3. avro-hadoop.jar
> This would contain the hadoop API and possibly pig/hive/whatever related to that.  This makes it easier for pig/hive/hadoop to consume avro-core without circular dependencies.
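
To make item 1 of the proposal quoted above concrete, the kind of application code below would need only the proposed avro-core jar (generic data plus binary io, with jackson pulled in by the schema parser) and nothing from the compiler, idl, ipc, or hadoop packages.  The schema literal is made up, and the Schema.Parser/EncoderFactory calls are the post-1.4 API:

{code:java}
// Sketch of "using avro as a library" with only the proposed core jar on the
// classpath: generic data plus binary io.  The schema is made up, and the
// Schema.Parser/EncoderFactory calls are the post-1.4 API.
import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class CoreOnlySketch {
  public static void main(String[] args) throws Exception {
    // Parsing the schema is what pulls in jackson, core's main hard dependency.
    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
        + "{\"name\":\"name\",\"type\":\"string\"}]}");

    GenericRecord user = new GenericData.Record(schema);
    user.put("name", "avro");

    // Serialize with the generic API and the binary encoder; all core classes.
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
    new GenericDatumWriter<GenericRecord>(schema).write(user, enc);
    enc.flush();
    System.out.println(out.size() + " bytes");
  }
}
{code}
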

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

