incubator-hcatalog-user mailing list archives

From Bennett Andrews <benn...@tumblr.com>
Subject Re: Errors in Pig when filtering by partition column
Date Wed, 20 Feb 2013 19:43:20 GMT
Awesome, thanks! I'll give that a shot.


On Wed, Feb 20, 2013 at 2:32 PM, Cheolsoo Park <cheolsoo@cloudera.com> wrote:

> One possible hack you can try is jarjar. I've done this myself, and it
> worked:
>
> http://www.crobak.org/2012/10/using-jarjar-to-solve-hive-and-pig-antlr-conflicts/
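
For anyone hitting this before CDH4.2, the jarjar approach in that post works by repackaging the antlr classes bundled inside the Hive jar so they no longer shadow Pig's antlr 3.4 copy on the classpath. As a sketch, a jarjar rules file along these lines (the relocated package prefix is illustrative):

```
rule org.antlr.** org.shadow.antlr.@1
```

and then rewrite the jar with jarjar's `process` command, e.g. `java -jar jarjar.jar process antlr.rules hive-exec.jar hive-exec-shaded.jar` (the jar file names here are illustrative; use the Hive jar your setup actually loads).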
>
>
> On Wed, Feb 20, 2013 at 11:19 AM, Bennett Andrews <bennett@tumblr.com> wrote:
>
>> Thanks Cheolsoo.
>>
>> Don't suppose you have any ideas in the meantime? It looks like pig.jar
>> includes the antlr runtime (which must be 3.4?).  I tried replacing the Hive
>> client jar with antlr-runtime 3.4 but still see the same issue.
>>
>>
>> On Wed, Feb 20, 2013 at 1:44 PM, Cheolsoo Park <cheolsoo@cloudera.com> wrote:
>>
>>> Hi,
>>>
>>> >> java.lang.NoSuchFieldError: type
>>>
>>> This is an antlr version conflict. Pig uses antlr v3.4 while HCatalog uses
>>> v3.1 (because of its dependency on Hive).
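
One quick way to confirm a clash like this is to list which antlr runtime classes each jar on the classpath actually bundles. A minimal sketch (the jar paths you pass are illustrative; point it at your real pig and hive jars):

```python
# Sketch: list the org.antlr.runtime classes bundled inside a jar, to spot
# which jars ship their own (possibly conflicting) copy of the antlr runtime.
import sys
import zipfile

def antlr_classes(jar_path):
    """Return the org/antlr/runtime class entries bundled in a jar."""
    with zipfile.ZipFile(jar_path) as jar:
        return [name for name in jar.namelist()
                if name.startswith("org/antlr/runtime/")
                and name.endswith(".class")]

if __name__ == "__main__":
    # e.g. python antlr_check.py pig.jar hive-exec.jar
    for jar in sys.argv[1:]:
        hits = antlr_classes(jar)
        print(f"{jar}: {len(hits)} antlr runtime classes bundled")
```

If more than one jar reports bundled antlr runtime classes, whichever loads first wins, which is exactly how a v3.1 lexer ends up running against v3.4-generated parser code.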
>>>
>>> >> Using CDH4 and tried HCat 4 & 5
>>>
>>> CDH4.2 will address this issue since both Pig and Hive will use antlr
>>> v3.4.
>>>
>>> Thanks,
>>> Cheolsoo
>>>
>>>
>>>
>>> On Wed, Feb 20, 2013 at 10:28 AM, Bennett Andrews <bennett@tumblr.com> wrote:
>>>
>>>> I seem to be getting this error whenever I try to FILTER by a
>>>> partition column in Pig, such as:
>>>>
>>>> raw = LOAD '<table>' USING org.apache.hcatalog.pig.HCatLoader();
>>>> filtered = FILTER raw BY dt == '<partition>';
>>>>
>>>> Without the partition, things seem to work fine.  Using CDH4 and tried
>>>> HCat 4 & 5.
>>>>
>>>> Any suggestions on what the issue may be?
>>>>
>>>> Thanks
>>>>
>>>> Pig Stack Trace
>>>> ---------------
>>>> ERROR 2998: Unhandled internal error. type
>>>>
>>>> java.lang.NoSuchFieldError: type
>>>>     at org.apache.hadoop.hive.metastore.parser.FilterLexer.mLPAREN(FilterLexer.java:112)
>>>>     at org.apache.hadoop.hive.metastore.parser.FilterLexer.mTokens(FilterLexer.java:665)
>>>>     at org.antlr.runtime.Lexer.nextToken(Lexer.java:89)
>>>>     at org.antlr.runtime.BufferedTokenStream.fetch(BufferedTokenStream.java:133)
>>>>     at org.antlr.runtime.BufferedTokenStream.sync(BufferedTokenStream.java:127)
>>>>     at org.antlr.runtime.CommonTokenStream.setup(CommonTokenStream.java:132)
>>>>     at org.antlr.runtime.CommonTokenStream.LT(CommonTokenStream.java:91)
>>>>     at org.antlr.runtime.BufferedTokenStream.LA(BufferedTokenStream.java:162)
>>>>     at org.apache.hadoop.hive.metastore.parser.FilterParser.expression(FilterParser.java:206)
>>>>     at org.apache.hadoop.hive.metastore.parser.FilterParser.andExpression(FilterParser.java:152)
>>>>     at org.apache.hadoop.hive.metastore.parser.FilterParser.orExpression(FilterParser.java:96)
>>>>     at org.apache.hadoop.hive.metastore.parser.FilterParser.filter(FilterParser.java:70)
>>>>     at org.apache.hadoop.hive.metastore.ObjectStore.getFilterParser(ObjectStore.java:1598)
>>>>     at org.apache.hadoop.hive.metastore.ObjectStore.makeQueryFilterString(ObjectStore.java:1627)
>>>>     at org.apache.hadoop.hive.metastore.ObjectStore.listMPartitionsByFilter(ObjectStore.java:1690)
>>>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsByFilter(ObjectStore.java:1581)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>     at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
>>>>     at $Proxy11.getPartitionsByFilter(Unknown Source)
>>>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_partitions_by_filter(HiveMetaStore.java:2466)
>>>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.listPartitionsByFilter(HiveMetaStoreClient.java:691)
>>>>     at org.apache.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:112)
>>>>     at org.apache.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:85)
>>>>     at org.apache.hcatalog.mapreduce.HCatInputFormat.setFilter(HCatInputFormat.java:108)
>>>>     at org.apache.hcatalog.pig.HCatLoader.setLocation(HCatLoader.java:119)
>>>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:380)
>>>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:259)
>>>>     at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:180)
>>>>     at org.apache.pig.PigServer.launchPlan(PigServer.java:1275)
>>>>     at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1260)
>>>>     at org.apache.pig.PigServer.storeEx(PigServer.java:957)
>>>>     at org.apache.pig.PigServer.store(PigServer.java:924)
>>>>     at org.apache.pig.PigServer.openIterator(PigServer.java:837)
>>>>     at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:682)
>>>>     at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:303)
>>>>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:189)
>>>>     at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:165)
>>>>     at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>>>>     at org.apache.pig.Main.run(Main.java:490)
>>>>     at org.apache.pig.Main.main(Main.java:111)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>>>
>>>
>>>
>>
>
