hadoop-hive-dev mailing list archives

From "Edward Capriolo (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HIVE-487) Hive does not compile with Hadoop 0.20.0
Date Thu, 23 Jul 2009 20:57:14 GMT

    [ https://issues.apache.org/jira/browse/HIVE-487?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12734781#action_12734781 ]

Edward Capriolo commented on HIVE-487:

@Todd - Where were you a few weeks ago? :)

> Then in the build product we could simply include two jars and have the wrapper scripts swap between them based on version.

The jars are upstream in Hadoop core. I did not look into this closely, but the talk about 'sealing exceptions' above led me to believe I should not try this.

I have wrapped my head around most of your Dynamic Proxy idea. My only concern is whether the ant process will cooperate. Will Eclipse think the HWI classes are broken? Can we translate your run.sh into something ant/Eclipse can deal with?

public class WebServer {
  public void someMethod(String arg) {
    // Version-specific behavior hidden behind a common method signature.
    System.out.println("Webserver v1: " + arg);
  }
}
I really don't want to have one 'someMethod' for each Jetty method - just start(), stop(), and init(). I like your implementation, but this is such a 'hacky' thing that I wonder whether it is worth thinking that hard. Hopefully the Jetty crew will be happy with their API for the next few years, and hopefully we will not be supporting Hadoop 0.17.0 indefinitely. Honestly, all that reflection has me 'burnt out'.
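For reference, the Dynamic Proxy approach being discussed can be sketched roughly as below. This is only an illustration, not the actual patch: the WebServerShim interface, the wrap() helper, and FakeJettyServer are all hypothetical names standing in for whichever Jetty server class the classpath provides. The point is that only the three methods HWI needs appear in the interface, and no Jetty class is referenced at compile time.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// Hypothetical minimal interface: only the calls HWI actually needs.
interface WebServerShim {
  void start();
  void stop();
  void init();
}

public class ProxyDemo {

  // Wrap an arbitrary server object behind WebServerShim, forwarding
  // start()/stop()/init() by reflection so the concrete Jetty class is
  // resolved at runtime rather than at compile time.
  public static WebServerShim wrap(final Object jettyServer) {
    InvocationHandler handler = new InvocationHandler() {
      public Object invoke(Object proxy, Method method, Object[] args)
          throws Throwable {
        // Look up the same-named no-arg method on the wrapped object.
        Method target = jettyServer.getClass().getMethod(method.getName());
        return target.invoke(jettyServer);
      }
    };
    return (WebServerShim) Proxy.newProxyInstance(
        WebServerShim.class.getClassLoader(),
        new Class[] { WebServerShim.class },
        handler);
  }

  // Stand-in for a version-specific Jetty server class (hypothetical).
  public static class FakeJettyServer {
    public boolean started = false;
    public void start() { started = true; }
    public void stop()  { started = false; }
    public void init()  { }
  }

  public static void main(String[] args) {
    WebServerShim server = wrap(new FakeJettyServer());
    server.init();
    server.start();
    server.stop();
  }
}
```

Because the reflection lookup happens per call, this avoids writing one wrapper method per Jetty method by hand, which is exactly the boilerplate concern above.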

If you/we can tackle the ant/Eclipse issues, I would be happy to use the 'Dynamic Proxy', but maybe we should tackle it in a different JIRA, because this issue is a pretty big blocker and I am sure many people want to see it in the trunk.

> Hive does not compile with Hadoop 0.20.0
> ----------------------------------------
>                 Key: HIVE-487
>                 URL: https://issues.apache.org/jira/browse/HIVE-487
>             Project: Hadoop Hive
>          Issue Type: Bug
>    Affects Versions: 0.3.0
>            Reporter: Aaron Kimball
>            Assignee: Justin Lynn
>             Fix For: 0.4.0
>         Attachments: dynamic-proxy.tar.gz, HIVE-487-2.patch, hive-487-jetty-2.diff, hive-487-jetty.patch, hive-487.3.patch, hive-487.4.patch, HIVE-487.patch, jetty-patch.patch, junit-patch1.html
> Attempting to compile Hive with Hadoop 0.20.0 fails:
> aaron@jargon:~/src/ext/svn/hive-0.3.0$ ant -Dhadoop.version=0.20.0 package
> (several lines elided)
> compile:
>      [echo] Compiling: hive
>     [javac] Compiling 261 source files to /home/aaron/src/ext/svn/hive-0.3.0/build/ql/classes
>     [javac] /home/aaron/src/ext/svn/hive-0.3.0/build/ql/java/org/apache/hadoop/hive/ql/exec/ExecDriver.java:94: cannot find symbol
>     [javac] symbol  : method getCommandLineConfig()
>     [javac] location: class org.apache.hadoop.mapred.JobClient
>     [javac]       Configuration commandConf = JobClient.getCommandLineConfig();
>     [javac]                                            ^
>     [javac] /home/aaron/src/ext/svn/hive-0.3.0/build/ql/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java:241: cannot find symbol
>     [javac] symbol  : method validateInput(org.apache.hadoop.mapred.JobConf)
>     [javac] location: interface org.apache.hadoop.mapred.InputFormat
>     [javac]       inputFormat.validateInput(newjob);
>     [javac]                  ^
>     [javac] Note: Some input files use or override a deprecated API.
>     [javac] Note: Recompile with -Xlint:deprecation for details.
>     [javac] Note: Some input files use unchecked or unsafe operations.
>     [javac] Note: Recompile with -Xlint:unchecked for details.
>     [javac] 2 errors
> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:145: The following error occurred while executing this line:
> /home/aaron/src/ext/svn/hive-0.3.0/ql/build.xml:135: Compile failed; see the compiler error output for details.

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
