hive-user mailing list archives

From java8964 <>
Subject RE: Hive trunk unit test failed
Date Thu, 27 Feb 2014 01:06:27 GMT
OK. Now I understand that this error is due to the missing Hadoop native library.
If I manually add "" to java.library.path for this unit test, it passes.
So either the hadoop 2.2.0 artifacts coming from the Maven repository include a 32-bit Hadoop
native library, or they omit it entirely.
Now the question is: what is the correct way to run the unit tests in the new Maven build?
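As a side note, a minimal standalone check can confirm whether the JVM can see a native Hadoop library at all before running the tests. The class below is my own illustration, not part of the Hive build; it only exercises the same System.loadLibrary("hadoop") call that Hadoop's NativeCodeLoader makes under the hood:

```java
// Minimal sketch (hypothetical diagnostic, not part of Hive):
// checks whether the native Hadoop library ( is
// visible to this JVM via java.library.path.
public class NativeLibCheck {

    // True if System.loadLibrary("hadoop") succeeds, i.e.
    // is somewhere on java.library.path.
    static boolean nativeHadoopAvailable() {
        try {
            System.loadLibrary("hadoop");
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        System.out.println(nativeHadoopAvailable()
                ? "native hadoop library loaded"
                : "native hadoop library NOT found");
    }
}
```

Running it with -Djava.library.path pointing at the directory that holds a 64-bit (location depends on your install) shows whether the library the tests would pick up actually matches the platform.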


Subject: Hive trunk unit test failed
Date: Wed, 26 Feb 2014 14:49:41 -0500

I tried to run all the tests of the current Hive trunk code on my local Linux x64 box.
My "mvn clean package -DskipTests -Phadoop-2 -Pdist" works fine if I skip tests.
The following unit test failed, and then the build stopped.
I traced the code down to a native method invoked at "Method)", which throws
InvocationTargetException.
My questions are:
1) Does this mean the native code is not available in my environment, causing the above error?
2) If so, since the latest Hive build uses Maven, and I can see all the hadoop-2.2.0 jar
files downloaded into my local repository, why does this error still happen?
3) Is it possible that, because my local environment is 64-bit, the default hadoop-2.2.0
comes with 32-bit native code? If so, how do I fix that during the Hive build?
Running org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils
Tests run: 8, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.802 sec <<< FAILURE! - in org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils
detemineSchemaTriesToOpenUrl(org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils)  Time elapsed: 0.377 sec  <<< ERROR!
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at Method)
    at <clinit>(
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
    at java.lang.reflect.Constructor.newInstance(
    at org.apache.hadoop.util.ReflectionUtils.newInstance(
    at <init>(
    at
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(
    at org.apache.hadoop.fs.FileSystem$Cache.get(
    at org.apache.hadoop.fs.FileSystem.get(
    at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.getSchemaFromFS(
    at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.determineSchemaOrThrowException(
    at org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils.detemineSchemaTriesToOpenUrl(