incubator-hcatalog-user mailing list archives

From agateaaa <agate...@gmail.com>
Subject Hcatalog 0.4.1 and java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors and guava
Date Tue, 06 Nov 2012 03:32:38 GMT
Hi

I am trying to use the latest HCatalog 0.4.1 (built from source) with Pig 0.9.2 to insert data into a Hive table through HCatalog.


2012-11-06 03:07:18,733 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.sameThreadExecutor()Lcom/google/common/util/concurrent/ListeningExecutorService;
	at com.google.common.cache.LocalCache.<clinit>(LocalCache.java:155)
	at com.google.common.cache.LocalCache$LocalManualCache.<init>(LocalCache.java:4750)
	at com.google.common.cache.LocalCache$LocalManualCache.<init>(LocalCache.java:4745)
	at com.google.common.cache.CacheBuilder.build(CacheBuilder.java:757)
	at org.apache.hcatalog.common.HiveClientCache.<init>(HiveClientCache.java:87)
	at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:538)
	at org.apache.hcatalog.mapreduce.FileOutputCommitterContainer.cancelDelegationTokens(FileOutputCommitterContainer.java:676)
	at org.apache.hcatalog.mapreduce.FileOutputCommitterContainer.internalAbortJob(FileOutputCommitterContainer.java:666)
	at org.apache.hcatalog.mapreduce.FileOutputCommitterContainer.cleanupJob(FileOutputCommitterContainer.java:201)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter.cleanupJob(PigOutputCommitter.java:163)
	at org.apache.hadoop.mapreduce.OutputCommitter.commitJob(OutputCommitter.java:76)
	at org.apache.hadoop.mapred.Task.runJobCleanupTask(Task.java:1055)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:357)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)

Is this because of HCATALOG-422, which introduced HiveClientCache with a dependency on Guava?
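The failing descriptor (MoreExecutors.sameThreadExecutor() returning ListeningExecutorService) suggests an older Guava copy is being loaded ahead of guava-11.0.2 in the task JVM. Here is a small diagnostic I can run with the tasktracker's classpath to see which jar actually supplies MoreExecutors; it is just a sketch I put together (WhichGuava is my own hypothetical class, not part of HCatalog or Guava):

```java
// Hypothetical diagnostic, not part of HCatalog: reports which jar the JVM
// resolves Guava's MoreExecutors from, and whether it has the
// sameThreadExecutor() method that the stack trace above fails to find.
public class WhichGuava {
    static String diagnose() {
        try {
            Class<?> c = Class.forName("com.google.common.util.concurrent.MoreExecutors");
            // CodeSource tells us which jar (or directory) the class came from.
            Object src = c.getProtectionDomain().getCodeSource();
            String where = (src == null) ? "bootstrap classpath" : src.toString();
            // Present in Guava 10+; older copies lack it entirely.
            c.getMethod("sameThreadExecutor");
            return "ok: " + where;
        } catch (ClassNotFoundException e) {
            return "no guava on classpath";
        } catch (NoSuchMethodException e) {
            return "old guava: sameThreadExecutor missing";
        }
    }

    public static void main(String[] args) {
        System.out.println(diagnose());
    }
}
```

Run with `java -cp <same classpath as the task JVM> WhichGuava`; if it reports an "ok" location that is not guava-11.0.2.jar, or reports the method missing, that would confirm a shadowing jar.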

I have these jars on my classpath (I added guava-11.0.2.jar to PIG_CLASSPATH):

export HIVE_HOME=<hive-home>
export HCAT_HOME=<hcat-home>
export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.1-dev.jar:$HIVE_HOME/lib/hive-metastore-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.7.0.jar:$HIVE_HOME/lib/hive-exec-0.9.0.jar:$HIVE_HOME/lib/libfb303-0.7.0.jar:$HIVE_HOME/lib/jdo2-api-2.3-ec.jar:$HIVE_HOME/conf:/etc/hadoop:$HIVE_HOME/lib/slf4j-api-1.6.1.jar:/usr/local/myapp/guava-11.0.2.jar
export PIG_OPTS=-Dhive.metastore.uris=thrift://myhivemetastoreserver:10000
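To look for a second, older Guava jar hiding somewhere on the cluster, I sketched this small helper (FindGuavaJars is my own throwaway class; the directories in main are guesses, not my actual layout):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

// Hypothetical helper: lists every guava*.jar under the given directories,
// since an older copy earlier on the tasktracker's classpath (e.g. a
// guava-r09.jar bundled with Hadoop or Hive) would shadow guava-11.0.2
// and produce exactly this NoSuchMethodError.
public class FindGuavaJars {
    static List<Path> find(Path... dirs) throws IOException {
        List<Path> hits = new ArrayList<>();
        for (Path dir : dirs) {
            if (!Files.isDirectory(dir)) continue;          // skip missing dirs
            try (Stream<Path> s = Files.walk(dir)) {
                s.filter(p -> p.getFileName().toString().matches("guava.*\\.jar"))
                 .forEach(hits::add);
            }
        }
        return hits;
    }

    public static void main(String[] args) throws IOException {
        // Adjust these paths to your install; they are assumptions.
        for (Path p : find(Paths.get("/usr/lib/hadoop/lib"),
                           Paths.get("/usr/local/myapp"))) {
            System.out.println(p);
        }
    }
}
```

If this turns up more than one Guava jar, the one that wins is whichever appears first on the task JVM's classpath.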


pig -Dpig.additional.jars=$PIG_CLASSPATH mypigscript.pig


The job runs to the very end and then throws the error on one of the tasktrackers. Has anyone seen this before? I would appreciate any help.

(Before I added the Guava jar to the classpath, the Pig job wouldn't even fire up; it failed with a similar error.)


Thanks
Agatea
