From: "kulkarni.swarnim@gmail.com"
Date: Tue, 17 Jul 2012 09:56:06 -0500
Subject: Re: ipc.RemoteException in Hive
To: user@hive.apache.org
Cc: a.sverdlov@rambler-co.ru

"select *" queries don't actually run a MapReduce job; Hive reads the results directly from HDFS. "select count(*)", on the other hand, does launch mappers and reducers to compute the count. The fact that the former works while the latter fails suggests something might be wrong with your Hadoop installation. Looking at the stack trace, it also seems the user executing this query might not have the proper access.

Are you able to run simple M/R jobs with the installation? You might also want to check the permissions.

On Tue, Jul 17, 2012 at 9:24 AM, Pavel Mezentsev <pavel@mezentsev.org> wrote:
> Hello all!
>
> We are having trouble with Hive.
> My colleague created the external table "as_test" in Hive:
>
>     create external table as_test (line STRING) location '/logs/2012-07-16';
>
> The query
>
>     select * from as_test limit 10;
>
> completes successfully, but the query
>
>     select count(1) from as_test limit 10;
>
> raises org.apache.hadoop.ipc.RemoteException.
> How can we fix it?
>
> Best regards,
> Pavel
>
> P.S.
> full stack trace:
>
> org.apache.hadoop.ipc.RemoteException: IPC server unable to read call
> parameters: readObject can't find class
> org.apache.hadoop.fs.permission.FsPermission$2
>         at org.apache.hadoop.ipc.Client.call(Client.java:1107)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
>         at $Proxy4.setPermission(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>         at $Proxy4.setPermission(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:855)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.setPermission(DistributedFileSystem.java:560)
>         at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:123)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:839)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
>         at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:657)
>         at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
>
> Job Submission failed with exception
> 'org.apache.hadoop.ipc.RemoteException(IPC server unable to read call
> parameters: readObject can't find class
> org.apache.hadoop.fs.permission.FsPermission$2)'
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.MapRedTask

--
Swarnim
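[A sketch of the smoke test Swarnim suggests: run a trivial MapReduce job as the same user that runs the Hive query, and inspect the relevant HDFS permissions. The jar name, paths, and default staging directory below are assumptions for a Hadoop 0.20/1.x layout and may differ on your cluster.]

```shell
# Run a trivial MapReduce job as the Hive user
# (jar name and $HADOOP_HOME layout are assumptions)
hadoop jar $HADOOP_HOME/hadoop-examples-*.jar pi 2 10

# Check who owns the table data and the MapReduce staging area
hadoop fs -ls /logs/2012-07-16
hadoop fs -ls /tmp/hadoop/mapred/staging   # assumed default; check mapreduce.jobtracker.staging.root.dir
```

If the example job fails with the same RemoteException, the problem lies in the Hadoop installation itself rather than in Hive.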