hive-user mailing list archives

From Eric Chu <e...@rocketfuel.com>
Subject Re: Hive unit test errors
Date Fri, 06 Sep 2013 23:12:54 GMT
I found out what I missed: I needed to set JAVA_HOME. It would be good if we
could add that to the unit test documentation; it's not intuitive, because
some tests did run without it. After setting it, the unit tests seem to
run fine.

Thanks,

Eric
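
[Editor's note] The fix above can be sketched as follows; the JDK path is an
example, not from the thread, so adjust it to your system:

```shell
# Point JAVA_HOME at your JDK before running the Hive unit tests.
# The path below is a typical Linux location and is only an example.
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
export PATH="$JAVA_HOME/bin:$PATH"
```

After this, re-run the test target from the thread (ant clean package test).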


On Fri, Sep 6, 2013 at 2:27 AM, Eric Chu <echu@rocketfuel.com> wrote:

> It does, and MiniMrShim is defined in HadoopShims.java... Anyway, the
> problem came up after various experiments to get the unit tests to pass,
> so maybe at some point something got corrupted.
>
> My main concern, however, is much broader than this: I can never get all
> the unit tests to pass for Hive 11. It's much worse than when I upgraded
> Hive to 0.10. I don't know what to expect from these unit tests. Are they
> all supposed to pass if you just run "ant clean package test" on
> branch-0.11?
>
> I tried the following approach: I installed VirtualBox and Vagrant
> (lucid32) and the other programs required to run Hive's unit tests (ant,
> git, java, make), then cloned the Apache Hive repo and ran the following
> on branch-0.11:
>
> export ANT_OPTS="-Xms768m -Xmx1024m -XX:PermSize=128m -XX:MaxPermSize=128m"
> ant clean package test -logfile ant.log -Dtest.silent=false
>
> With this approach I got many errors (the full log is attached),
> typically of the form:
>
> Begin query: alter3.q
>     [junit] Deleted file:/home/vagrant/hive/build/ql/test/data/warehouse/alter3_src
>     [junit] /home/vagrant/hive/testutils/hadoop: line 109: /bin/java: No such file or directory
>     [junit] /home/vagrant/hive/testutils/hadoop: line 109: exec: /bin/java: cannot execute: No such file or directory
>     [junit] Exception: Client Execution failed with error code = 126
>     [junit] See build/ql/tmp/hive.log, or try "ant test ... -Dtest.silent=false" to get more logs.
>     [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 126
>     [junit] See build/ql/tmp/hive.log, or try "ant test ... -Dtest.silent=false" to get more logs.
> ...
>
> /home/vagrant/hive/testutils/hadoop: line 109: /bin/java: No such file or directory
>     [junit] /home/vagrant/hive/testutils/hadoop: line 109: exec: /bin/java: cannot execute: No such file or directory
>     [junit] Execution failed with exit status: 126
>     [junit] Obtaining error information
>     [junit]
>     [junit] Task failed!
>     [junit] Task ID:
>     [junit]   null
> ...
>  Logs:
>     [junit]
>     [junit] /home/vagrant/hive/build/ql/tmp/hive.log
>     [junit] testMapPlan1 execution failed with exit status: 126
>     [junit] junit.framework.AssertionFailedError: expected:<true> but was:<false>
>     [junit]     at junit.framework.Assert.fail(Assert.java:47)
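
[Editor's note] A hedged reading of the /bin/java errors above (an
assumption, since line 109 of testutils/hadoop is not shown in this
thread): the script execs "$JAVA_HOME/bin/java", and with JAVA_HOME unset
that string collapses to /bin/java, which normally does not exist:

```shell
# Demonstration of the path collapse: with JAVA_HOME unset, the
# interpolated launcher path loses its prefix.
unset JAVA_HOME
echo "${JAVA_HOME}/bin/java"   # prints "/bin/java"
```

This is consistent with the resolution reported at the top of the thread:
after JAVA_HOME is set, the tests run.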
>
>
> I'm trying to make sure the unit tests pass on 0.11 before I upgrade from
> 0.10 to 0.11. The code is straight from branch-0.11, and I followed the
> instructions from the Hive wiki to run the unit tests, so I don't
> understand why I'd get so many of these errors. Could anyone who regularly
> gets the unit tests to pass (e.g., regular committers) share some
> insights? The documentation is painfully insufficient. Also, given a
> branch like 0.11, how many tests should I expect to pass? (In my
> experience a majority of tests fail.)
>
> Thanks so much,
>
> Eric
>
>
> On Wed, Sep 4, 2013 at 11:00 AM, Sushanth Sowmyan <khorgath@gmail.com> wrote:
>
>> This seems to work for me on trunk. This should work in 0.11 as well,
>> given that the feature it is complaining about was introduced in
>> HIVE-4139, which was committed in hive-0.11.
>>
>> Could you please check your /hive/build/shims/hive-shims-0.11.0.jar
>> (it should be the same as
>> /Users/echu/.ivy2/local/org.apache.hive/hive-shims/0.11.0/jars/hive-shims.jar)
>> and run jar tf on it to see whether it has the following definitions:
>>
>> org/apache/hadoop/hive/shims/Hadoop20SShims$MiniMrShim.class
>> org/apache/hadoop/hive/shims/Hadoop20Shims$MiniMrShim.class
>> org/apache/hadoop/hive/shims/Hadoop23Shims$MiniMrShim.class
>> org/apache/hadoop/hive/shims/HadoopShims$MiniMrShim.class
>>
>> And failing that, could you check whether your codebase has MiniMrShim
>> defined in
>> shims/src/common/java/org/apache/hadoop/hive/shims/HadoopShims.java?
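
[Editor's note] The jar check above could be scripted as follows; this is a
sketch, the jar path is taken from the thread, and the helper name is made
up for illustration:

```shell
# List a jar's entries and keep only the MiniMrShim classes.
# Usage (assumes a built Hive tree and a JDK's jar tool on PATH):
#   check_minimr build/shims/hive-shims-0.11.0.jar
check_minimr() {
  jar tf "$1" | grep 'MiniMrShim\.class'
}
```

If the command prints nothing, the shims jar was built without the
MiniMrShim inner classes and should be rebuilt.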
>>
>> On Tue, Sep 3, 2013 at 2:39 PM, Eric Chu <echu@rocketfuel.com> wrote:
>> > Hi,
>> >
>> > I'm trying to run unit tests on Hive 11 (with a few patches such as
>> > 4619, 4003, 4403, 4900, 3632, 4942) and am encountering compile-test
>> > errors. I got the same thing when I ran on trunk or just the 0.11
>> > branch. Is there something I'm missing? Note that I could build and
>> > run Hive without problems.
>> >
>> > Commands:
>> >     export ANT_OPTS=-XX:MaxPermSize=512M
>> >     ant very-clean package test -logfile ant.log
>> >
>> > Error (Attached is full log):
>> > compile-test:
>> >      [echo] Project: ql
>> >     [javac] Compiling 92 source files to /hive/build/ql/test/classes
>> >     [javac] /hive/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:118: cannot find symbol
>> >     [javac] symbol  : class MiniMrShim
>> >     [javac] location: interface org.apache.hadoop.hive.shims.HadoopShims
>> >     [javac]   private HadoopShims.MiniMrShim mr = null;
>> >     [javac]                      ^
>> >     [javac] /hive/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java:284: cannot find symbol
>> >     [javac] symbol  : method getMiniMrCluster(org.apache.hadoop.hive.conf.HiveConf,int,java.lang.String,int)
>> >     [javac] location: interface org.apache.hadoop.hive.shims.HadoopShims
>> >     [javac]       mr = ShimLoader.getHadoopShims().getMiniMrCluster(conf, 4, getHdfsUriString(fs.getUri().toString()), 1);
>> >     [javac]                                       ^
>> >     [javac] /hive/ql/src/test/org/apache/hadoop/hive/ql/exec/TestFunctionRegistry.java:57: cannot find symbol
>> >     [javac] symbol  : variable decimalTypeInfo
>> >     [javac] location: class org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory
>> >     [javac]     implicit(TypeInfoFactory.intTypeInfo, TypeInfoFactory.decimalTypeInfo, true);
>> >
>> > ....
>> >
>> > Thanks,
>> >
>> > Eric
>> >
>>
>
>
