From: ch huang <justlooks@gmail.com>
To: user@hadoop.apache.org
Date: Fri, 2 Aug 2013 09:44:37 +0800
Subject: Re: test lzo problem in hadoop

I use YARN. After I commented out the following option in /etc/hadoop/conf/mapred-site.xml, the error changed:

<!--
        <property>
                <name>mapred.job.tracker</name>
                <value>CH22:8088</value>
        </property>
-->

# hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo
13/08/02 09:25:51 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/08/02 09:25:51 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
13/08/02 09:25:52 INFO lzo.DistributedLzoIndexer: Adding LZO file /alex/ttt.lzo to indexing list (no index currently exists)
13/08/02 09:25:52 WARN conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
13/08/02 09:25:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/08/02 09:25:52 WARN conf.Configuration: slave.host.name is deprecated. Instead, use dfs.datanode.hostname
13/08/02 09:25:52 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/08/02 09:25:52 INFO input.FileInputFormat: Total input paths to process : 1
13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter set in config null
13/08/02 09:25:53 INFO mapred.JobClient: Running job: job_local180628093_0001
13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter is com.hadoop.mapreduce.LzoIndexOutputFormat$1
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Waiting for map tasks
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Starting task: attempt_local180628093_0001_m_000000_0
13/08/02 09:25:53 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
13/08/02 09:25:53 INFO util.ProcessTree: setsid exited with exit code 0
13/08/02 09:25:53 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@338e18a3
13/08/02 09:25:53 INFO mapred.MapTask: Processing split: hdfs://CH22:9000/alex/ttt.lzo:0+306
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Map task executor complete.
13/08/02 09:25:53 WARN mapred.LocalJobRunner: job_local180628093_0001
java.lang.Exception: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
        at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:47)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:478)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:671)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
        at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
13/08/02 09:25:54 INFO mapred.JobClient:  map 0% reduce 0%
13/08/02 09:25:54 INFO mapred.JobClient: Job complete: job_local180628093_0001
13/08/02 09:25:54 INFO mapred.JobClient: Counters: 0

On Mon, Jul 22, 2013 at 6:07 PM, Sandeep Nemuri <nhsandeep6@gmail.com> wrote:

> Try this command:
>
> hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
>
> On Mon, Jul 22, 2013 at 2:08 PM, ch huang <justlooks@gmail.com> wrote:
>
>> anyone can help?
>>
>> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar com.hadoop.compression.lzo.DistributedLzoIndexer /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
>> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
>> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no index currently exists)
>> 13/07/22 16:33:50 ERROR security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>> Exception in thread "main" java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>         at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown Source)
>>         at org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>>         at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>>         at com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>         at com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
>>         at com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>>         at com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>         at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>>         at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>>         at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>>         at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>>         at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>>         at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>>         at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>>         at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>>         at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>
> --
> --Regards
> Sandeep Nemuri
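A note on the remaining IncompatibleClassChangeError above: "Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected" typically means the hadoop-lzo jar was compiled against Hadoop 1, where TaskAttemptContext was a concrete class, but is running on Hadoop 2/YARN, where it became an interface; bytecode linked against the old class form fails at load time. The usual fix matches Sandeep's suggestion: use an LZO jar built for the running Hadoop version (e.g. the hadoop-lzo-cdh4 gplextras build on CDH4) rather than the Hadoop 1 build. The sketch below is only an illustration of the class-vs-interface distinction via reflection; it uses java.util types as stand-ins because the Hadoop jars are not assumed to be on the classpath.

```java
// Illustrative only: bytecode compiled against a concrete class is linked
// differently than bytecode compiled against an interface, so a type that
// changes from class to interface between compile time and runtime triggers
// IncompatibleClassChangeError. Class.isInterface() reports a type's shape.
public class ApiShapeCheck {
    static String shape(Class<?> c) {
        return c.isInterface() ? "interface" : "class";
    }

    public static void main(String[] args) {
        // Stand-ins for the Hadoop 1 class form vs. the Hadoop 2 interface
        // form of TaskAttemptContext (hypothetical mapping, for illustration).
        System.out.println("java.util.List is an " + shape(java.util.List.class));
        System.out.println("java.util.ArrayList is a " + shape(java.util.ArrayList.class));
    }
}
```

In practice there is nothing to patch in your own code here: the indexer jar itself has to be rebuilt (or replaced) against the Hadoop 2 MapReduce API.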