Subject: Re: Pig cannot load data using hbasestorage
From: Kyle Lin
To: user@hbase.apache.org
Date: Tue, 10 Dec 2013 10:06:01 +0800

Hello Ted

As the Pig log said "Please look at the previous logs lines from the task's
full log for more details", I re-ran the scenario to reproduce the same
error. After running the failing script I went through all the logs under
/var/log/hadoop-mapreduce, /var/log/hadoop-yarn, ....., but could not find
any related errors in them. No log says I am missing any classes, but when I
include more jars, the script works fine.

Kyle
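(Kyle does not list which extra jars he registered. The following is a rough
sketch only; the jar paths and file names are assumptions for an HDP 2
sandbox layout, not taken from this thread. A register list along these lines
is typically what HBaseStorage needs against HBase 0.96:)

    -- Sketch: adjust names/versions to whatever actually sits under
    -- /usr/lib/hbase/lib and /usr/lib/zookeeper on the install.
    REGISTER /usr/lib/zookeeper/zookeeper.jar;
    REGISTER /usr/lib/hbase/lib/hbase-common.jar;
    REGISTER /usr/lib/hbase/lib/hbase-client.jar;
    REGISTER /usr/lib/hbase/lib/hbase-protocol.jar;
    REGISTER /usr/lib/hbase/lib/hbase-server.jar;
    REGISTER /usr/lib/hbase/lib/protobuf-java.jar;  -- assumed dependency jars
    REGISTER /usr/lib/hbase/lib/guava.jar;
    REGISTER /usr/lib/hbase/lib/htrace-core.jar;

    samples = LOAD 'hbase://test'
              USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
                  'cf:name cf:phone cf:city cf:address')
              AS (name, phone, city, address);
    DUMP samples;

(Which jars are actually required depends on the HBase build, so trial runs
like Kyle's, adding jars until the MapReduce tasks stop failing, are a common
way to arrive at the final list.)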
2013/12/9 Ted Yu

> bq. Please look at the previous logs lines from the task's full log for
> more details.
>
> Do you still keep the full log ?
> If so, can you pastebin the full log with the detail ?
>
> Cheers
>
>
> On Mon, Dec 9, 2013 at 5:27 PM, Kyle Lin wrote:
>
> > Hey
> >
> > Thanks for your help. But the story does not end there...
> >
> > Finally, I used the latest Hortonworks Sandbox 2 as my testing
> > environment (Pig 0.12.0, HBase 0.96.0), but got another problem.
> >
> > My pig script is below (I run it with "pig -f xx.pig"):
> >
> > REGISTER /usr/lib/hbase/lib/zookeeper.jar;
> > REGISTER /usr/lib/hbase/lib/hbase-*.jar;
> > REGISTER /usr/lib/hadoop/hadoop*.jar
> > samples = LOAD 'hbase://test' using
> >     org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone cf:city cf:address')
> >     as (name, phone, city, address);
> > dump samples;
> >
> > The stack trace:
> >
> > ERROR 1066: Unable to open iterator for alias samples. Backend error :
> > java.io.IOException: Cannot create a record reader because of a previous
> > error. Please look at the previous logs lines from the task's full log
> > for more details.
> >
> > org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias samples. Backend error : java.io.IOException: Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.
> >         at org.apache.pig.PigServer.openIterator(PigServer.java:870)
> >         at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:774)
> >         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
> >         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
> >         at org.apache.pig.Main.run(Main.java:478)
> >         at org.apache.pig.Main.main(Main.java:156)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > Caused by: java.lang.RuntimeException: java.io.IOException: Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.
> >         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:266)
> >         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.<init>(PigRecordReader.java:123)
> >         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.createRecordReader(PigInputFormat.java:123)
> >         at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:491)
> >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:734)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
> >         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> >         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> > Caused by: java.io.IOException: Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.
> >         at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:119)
> >         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:256)
> >
> >
> > Kyle
> >
> >
> > 2013/12/9 Ted Yu
> >
> > > Please use this link to install:
> > > http://hortonworks.com/products/hdp-2/#install
> > >
> > > Cheers
> > >
> > >
> > > On Mon, Dec 9, 2013 at 10:06 AM, Kyle Lin wrote:
> > >
> > > > Hello Ted
> > > >
> > > > Actually I used the Hortonworks Sandbox 2.0 Beta. Would I get rid of
> > > > this problem by using Ambari to install HDP 2?
> > > >
> > > > Kyle
> > > >
> > > >
> > > > 2013/12/9 Ted Yu
> > > >
> > > > > Kyle:
> > > > > According to http://hortonworks.com/products/hdp-2/ , Pig 0.12
> > > > > should be used; it does not have this problem.
> > > > >
> > > > > Did you happen to use HDP 2 Beta ?
> > > > >
> > > > > Cheers
> > > > >
> > > > >
> > > > > On Mon, Dec 9, 2013 at 9:49 AM, Kyle Lin wrote:
> > > > >
> > > > > > Hello Ted
> > > > > >
> > > > > > Below is the stack trace. Could it be that HBase has removed
> > > > > > WritableByteArrayComparable, but Pig has not been updated for
> > > > > > this change?
> > > > > >
> > > > > >
> > > > > > Pig Stack Trace
> > > > > > ---------------
> > > > > > ERROR 2998: Unhandled internal error. org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> > > > > >
> > > > > > java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> > > > > >         at java.lang.Class.forName0(Native Method)
> > > > > >         at java.lang.Class.forName(Class.java:247)
> > > > > >         at org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
> > > > > >         at org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
> > > > > >         at org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
> > > > > >         at org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
> > > > > >         at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
> > > > > >         at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
> > > > > >         at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
> > > > > >         at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
> > > > > >         at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
> > > > > >         at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
> > > > > >         at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
> > > > > >         at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
> > > > > >         at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
> > > > > >         at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
> > > > > >         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
> > > > > >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
> > > > > >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
> > > > > >         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
> > > > > >         at org.apache.pig.Main.run(Main.java:543)
> > > > > >         at org.apache.pig.Main.main(Main.java:158)
> > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > > > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > > > > > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> > > > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > > > >         ... 27 more
> > > > > >
> > > > > >
> > > > > > Kyle
> > > > > >
> > > > > >
> > > > > > 2013/12/7 Ted Yu
> > > > > >
> > > > > > > bq. HDP2(HBase 0.95.2.2.0.5.0-64
> > > > > > >
> > > > > > > HDP2 goes with 0.96.0
> > > > > > >
> > > > > > > bq. java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > > > > >
> > > > > > > Can you show us the stack trace ?
> > > > > > > WritableByteArrayComparable doesn't exist in 0.96 and later branches.
> > > > > > >
> > > > > > > Cheers
> > > > > > >
> > > > > > >
> > > > > > > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy wrote:
> > > > > > >
> > > > > > > > Do a register of your hbase and zookeeper jars in the pig script.
> > > > > > > >
> > > > > > > > -Rohini
> > > > > > > >
> > > > > > > >
> > > > > > > > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <kylelin2000@gmail.com> wrote:
> > > > > > > >
> > > > > > > > > Hey there
> > > > > > > > >
> > > > > > > > > First, my environment: Hortonworks HDP2(HBase 0.95.2.2.0.5.0-64, Pig 0.11.1).
> > > > > > > > >
> > > > > > > > > I used Pig to load data from HBase and got an exception message of
> > > > > > > > > java.lang.ClassNotFoundException:
> > > > > > > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > > > > > > > My script is like below:
> > > > > > > > >
> > > > > > > > > samples = LOAD 'hbase://test' using
> > > > > > > > >     org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone cf:city cf:address')
> > > > > > > > >     as (name, phone, city, address);
> > > > > > > > > dump samples;
> > > > > > > > >
> > > > > > > > > After googling, people said you need to set PIG_CLASSPATH
> > > > > > > > > first, so I tried to add the target jar to PIG_CLASSPATH,
> > > > > > > > > but I cannot find
> > > > > > > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> > > > > > > > > in any of the hbase jars.
> > > > > > > > >
> > > > > > > > >
> > > > > > > > > Kyle