incubator-hcatalog-user mailing list archives

From Travis Crawford <traviscrawf...@gmail.com>
Subject Re: Hcatalog 0.4.1 and java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors and guava
Date Tue, 06 Nov 2012 18:06:01 GMT
Yeah, looking at
https://svn.apache.org/repos/asf/pig/branches/branch-0.9/ivy/libraries.properties
we see pig 0.9 uses guava 11, so it's curious this is an issue.

Were you able to try adding the HCatalog jars through register
statements, instead of putting them on the classpath? We use the following:

register /usr/lib/hive/lib/{hive-metastore-*-*,hive-exec-*-*,libfb303-*}.jar;
register /usr/lib/hcatalog/share/hcatalog/hcatalog-{core,pig-adapter}-*-*.jar;

You might have to change the globs to match your environment, but this
gives the general idea.
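If it helps, those globs can be sanity-checked before launching pig. A minimal sketch (the install paths below are illustrative, not a canonical layout) that expands each pattern and prints the resulting register statements, so a pattern that matches nothing fails loudly instead of silently:

```python
import glob

def register_statements(patterns):
    """Expand jar globs into pig `register` lines; warn on empty matches."""
    lines = []
    for pat in patterns:
        matches = sorted(glob.glob(pat))
        if not matches:
            print("WARNING: no jar matches %s" % pat)
        lines.extend("register %s;" % jar for jar in matches)
    return lines

if __name__ == "__main__":
    # Illustrative install locations -- adjust to your environment.
    for line in register_statements([
        "/usr/lib/hive/lib/hive-metastore-*.jar",
        "/usr/lib/hive/lib/hive-exec-*.jar",
        "/usr/lib/hive/lib/libfb303-*.jar",
        "/usr/lib/hcatalog/share/hcatalog/hcatalog-core-*.jar",
        "/usr/lib/hcatalog/share/hcatalog/hcatalog-pig-adapter-*.jar",
    ]):
        print(line)
```

Paste the printed lines at the top of the pig script once they look right.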

--travis


On Tue, Nov 6, 2012 at 9:59 AM, agateaaa <agateaaa@gmail.com> wrote:
> Thanks for your reply.
>
> I double-checked all the jars on my box and see guava classes bundled
> into the pig-withouthadoop jar (pig 0.9.2). As far as I can tell they
> appear to be guava 11.0.2, though I may be wrong; I don't know how I
> can confirm that, so if anyone has any suggestions I'd appreciate it.
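One way to confirm the bundled version is to read the metadata Maven records inside the jar. A sketch with Python's zipfile (the entry path assumes a Maven-built guava jar, which is how the published guava jars are built; the example path is illustrative):

```python
import zipfile

# Maven-built jars record their coordinates here; guava's published jars do.
POM_PROPS = "META-INF/maven/com.google.guava/guava/pom.properties"

def guava_version(jar_path):
    """Return the guava version recorded inside the jar, or None if absent."""
    with zipfile.ZipFile(jar_path) as z:
        if POM_PROPS not in z.namelist():
            return None
        for line in z.read(POM_PROPS).decode("ascii", "replace").splitlines():
            if line.startswith("version="):
                return line.split("=", 1)[1].strip()
    return None

# e.g. print(guava_version("/usr/local/myapp/guava-11.0.2.jar"))
```

A jar that repackages guava classes without the Maven metadata (as a bundled pig jar might) will return None here, in which case you'd have to fall back on comparing class lists against a known guava release.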
>
> I tried switching to pig 0.10.0, but still get
> java.lang.NoSuchMethodError:
> com.google.common.util.concurrent.MoreExecutors.sameThreadExecutor()Lcom/google/common/util/concurrent/ListeningExecutorService;
>
>
>
>
> On Mon, Nov 5, 2012 at 7:38 PM, Travis Crawford <traviscrawford@gmail.com>
> wrote:
>>
>> Seeing NoSuchMethodError instead of a missing class feels like you may
>> have multiple versions of guava on your classpath, and the wrong one
>> is being used. In your particular setup would you expect to have guava
>> already on the classpath?
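A sketch of that check (jar names below are illustrative): scan every classpath entry for the class named in the stack trace; more than one hit means the JVM's first match wins, which is how an older copy can shadow the guava 11.0.2 jar that was added.

```python
import zipfile

# The class whose method is missing in the stack trace.
CONFLICT = "com/google/common/util/concurrent/MoreExecutors.class"

def jars_bundling(classpath_entries, class_entry=CONFLICT):
    """Return the classpath jars that contain the given class file."""
    hits = []
    for jar in classpath_entries:
        try:
            with zipfile.ZipFile(jar) as z:
                if class_entry in z.namelist():
                    hits.append(jar)
        except (OSError, zipfile.BadZipFile):
            continue  # directory, missing file, or not a jar -- skip it
    return hits

# e.g. jars_bundling(os.environ["PIG_CLASSPATH"].split(":"))
```

If this reports two or more jars, the one listed first on the classpath is the one actually loaded.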
>>
>> I don't think this matters, but you shouldn't have to put any jars on
>> your classpath; everything can be registered in the pig script
>> itself. I recommend that approach, as it keeps the script's jar
>> dependencies with the script rather than depending on a custom
>> environment.
>>
>> --travis
>>
>>
>>
>> On Mon, Nov 5, 2012 at 7:32 PM, agateaaa <agateaaa@gmail.com> wrote:
>> > Hi
>> >
>> > I am trying to use the latest HCatalog 0.4.1 (built from source) with
>> > pig 0.9.2 to insert data into a hive table using hcatalog.
>> >
>> >
>> > 2012-11-06 03:07:18,733 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.sameThreadExecutor()Lcom/google/common/util/concurrent/ListeningExecutorService;
>> >   at com.google.common.cache.LocalCache.<clinit>(LocalCache.java:155)
>> >   at com.google.common.cache.LocalCache$LocalManualCache.<init>(LocalCache.java:4750)
>> >   at com.google.common.cache.LocalCache$LocalManualCache.<init>(LocalCache.java:4745)
>> >   at com.google.common.cache.CacheBuilder.build(CacheBuilder.java:757)
>> >   at org.apache.hcatalog.common.HiveClientCache.<init>(HiveClientCache.java:87)
>> >   at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:538)
>> >   at org.apache.hcatalog.mapreduce.FileOutputCommitterContainer.cancelDelegationTokens(FileOutputCommitterContainer.java:676)
>> >   at org.apache.hcatalog.mapreduce.FileOutputCommitterContainer.internalAbortJob(FileOutputCommitterContainer.java:666)
>> >   at org.apache.hcatalog.mapreduce.FileOutputCommitterContainer.cleanupJob(FileOutputCommitterContainer.java:201)
>> >   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter.cleanupJob(PigOutputCommitter.java:163)
>> >   at org.apache.hadoop.mapreduce.OutputCommitter.commitJob(OutputCommitter.java:76)
>> >   at org.apache.hadoop.mapred.Task.runJobCleanupTask(Task.java:1055)
>> >   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:357)
>> >   at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> >   at java.security.AccessController.doPrivileged(Native Method)
>> >   at javax.security.auth.Subject.doAs(Subject.java:396)
>> >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>> >   at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> >
>> > Is this because of HCATALOG-422, which introduced HiveClient with a
>> > dependency on guava?
>> >
>> > I have these jars in my classpath (I added guava-11.0.2.jar to
>> > PIG_CLASSPATH):
>> >
>> > export HIVE_HOME=<hive-home>
>> > export HCAT_HOME=<hcat-home>
>> > export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.1-dev.jar:$HIVE_HOME/lib/hive-metastore-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.7.0.jar:$HIVE_HOME/lib/hive-exec-0.9.0.jar:$HIVE_HOME/lib/libfb303-0.7.0.jar:$HIVE_HOME/lib/jdo2-api-2.3-ec.jar:$HIVE_HOME/conf:/etc/hadoop:$HIVE_HOME/lib/slf4j-api-1.6.1.jar:/usr/local/myapp/guava-11.0.2.jar
>> > export PIG_OPTS=-Dhive.metastore.uris=thrift://myhivemetastoreserver:10000
>> >
>> >
>> > pig -Dpig.additional.jars=$PIG_CLASSPATH mypigscript.pig
>> >
>> >
>> > The job runs to the very end and then throws the error in one of the
>> > tasktrackers. Has anyone seen this before? Would appreciate it if
>> > anyone can help.
>> >
>> > (Before adding the guava jar to the classpath, the pig job wouldn't
>> > even fire up and failed with a similar error.)
>> >
>> >
>> > Thanks
>> > Agatea
>> >
>> >
>
>
