From: Sachin Sudarshana
To: user@hive.apache.org
Date: Mon, 5 Aug 2013 17:11:34 +0530
Subject: Loading data in Hive 0.11 - permission issue

Hi,

I'm using Hive 0.11, downloaded as a tarball from Apache's website.

I have a Linux user called "admin", and I invoke the Hive CLI as this user.

In the Hive terminal I created a table as follows:

hive> create table ptest (pkey INT, skey INT, fkey INT, rkey INT, units INT)
      row format delimited fields terminated by ','
      lines terminated by '\n'
      stored as textfile;
OK
Time taken: 0.241 seconds

When I try to load data into the table, I get the following error:

hive> LOAD DATA LOCAL INPATH '/home/admin/sample.csv' OVERWRITE INTO TABLE ptest;
Copying data from file:/home/admin/sample.csv
Copying file: file:/home/admin/sample.csv
Loading data to table default.ptest
rmr: DEPRECATED: Please use 'rm -r' instead.
rmr: Permission denied: user=admin, access=ALL, inode="/user/hive_0.11/warehouse/ptest":root:hive:drwxr-xr-x
Failed with exception Permission denied: user=admin, access=ALL, inode="/user/hive_0.11/warehouse/ptest":root:hive:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkSubAccess(FSPermissionChecker.java:174)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:144)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4684)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:2794)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:2757)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:2740)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:621)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:406)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44094)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

When I looked into the warehouse directory:

hive> dfs -ls /user/hive_0.11/warehouse;
Found 1 item
drwxr-xr-x   - root  hive          0 2013-08-05 17:04 /user/hive_0.11/warehouse/ptest

It seems the table directory is owned by root, even though it was created by invoking the Hive CLI as the user admin.

I'm unable to figure out why the owner of the table has been assigned as root. Could anyone please help me out?

Thank you,
Sachin
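P.S. A sketch of the diagnostics I'm considering (the HDFS commands are shown commented out since they need a live cluster; the chown target and the "hdfs" superuser name are guesses based on a typical setup, not taken from my cluster):

```shell
#!/bin/sh
# With Hive 0.11 and plain (non-Kerberos) authentication, the CLI picks up
# the local Unix identity, so HDFS permission checks run as whatever
# `whoami` reports here. This should print "admin" in my case.
whoami

# Inspect ownership of the warehouse directory (path copied from the error):
# hdfs dfs -ls /user/hive_0.11/warehouse

# If the table directory really is root:hive, handing it to the invoking
# user should let LOAD DATA ... OVERWRITE delete and rewrite it
# (assumes an HDFS superuser named "hdfs" exists):
# sudo -u hdfs hdfs dfs -chown -R admin:hive /user/hive_0.11/warehouse/ptest
```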