Subject: Re: How to Create file in HDFS using java Client with Permission
From: Yanbo Liang <yanbohappy@gmail.com>
To: user@hadoop.apache.org
Date: Fri, 15 Mar 2013 17:33:11 +0800

You must switch to the user dasmohap to run this client program; otherwise you cannot create files under the directory "/user/dasmohap". If you do not have a user called dasmohap on the client machine, create one, or work around it as described at http://stackoverflow.com/questions/11371134/how-to-specify-username-when-putting-files-on-hdfs-from-a-remote-machine . Note, however, that these workarounds are not secure. Having each user upload their own data under their own identity is the sensible approach.

2013/3/4 anil gupta <anilgupta84@gmail.com>:

> As per the error below, the user trying to write/read the file does not
> have the appropriate permission:
>
> File not found org.apache.hadoop.security.AccessControlException:
> Permission denied: user=hadoop, access=WRITE,
> inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
>
> HTH,
> Anil
>
> On Fri, Mar 1, 2013 at 3:20 AM, samir das mohapatra <
> samir.helpdoc@gmail.com> wrote:
>
>> Hi All,
>>     I wanted to know how to create a file in HDFS using a Java program.
>>
>> I wrote some code; it works fine in the dev cluster, but I am getting
>> an error in another cluster.
>>
>> Error:
>> Writing data into HDFS...................
>> Creating file
>> File not found org.apache.hadoop.security.AccessControlException:
>> Permission denied: user=hadoop, access=WRITE,
>> inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
>>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
>>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
>>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4547)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4518)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1755)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1690)
>>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1669)
>>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:409)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:205)
>>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44068)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
>>
>> Regards,
>> samir.
>
> --
> Thanks & Regards,
> Anil Gupta
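The denial at the top of the stack trace comes from the NameNode's FSPermissionChecker, which applies a POSIX-style owner/group/other check to the parent directory before a file may be created in it. A minimal self-contained model of that check (the class and method names here are hypothetical, for illustration only; this is a simplified sketch, not Hadoop's actual code) shows why user=hadoop is denied WRITE on a directory owned by dasmohap:dasmohap with mode drwxr-xr-x (0755):

```java
// Simplified sketch of the POSIX-style permission check HDFS performs.
// Hypothetical names; the real logic lives in
// org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.
public class HdfsPermissionModel {
    /**
     * mode is the 9-bit rwxrwxrwx value, e.g. 0755 for drwxr-xr-x.
     * access is 'r', 'w', or 'x'.
     */
    public static boolean canAccess(String user, String owner,
                                    boolean userInGroup, int mode, char access) {
        int shift;
        if (user.equals(owner))  shift = 6; // owner bits (rwx)
        else if (userInGroup)    shift = 3; // group bits (r-x)
        else                     shift = 0; // other bits (r-x)
        int bits = (mode >> shift) & 7;
        switch (access) {
            case 'r': return (bits & 4) != 0;
            case 'w': return (bits & 2) != 0;
            case 'x': return (bits & 1) != 0;
            default:  return false;
        }
    }

    public static void main(String[] args) {
        // inode "/user/dasmohap/samir_tmp": owner dasmohap, group dasmohap, 0755.
        // user=hadoop is neither the owner nor in the group, so WRITE is
        // checked against the "other" bits (r-x) and denied.
        System.out.println(canAccess("hadoop", "dasmohap", false, 0755, 'w'));   // denied
        System.out.println(canAccess("hadoop", "dasmohap", false, 0755, 'r'));   // allowed
        System.out.println(canAccess("dasmohap", "dasmohap", false, 0755, 'w')); // allowed
    }
}
```

The practical fixes follow from the model: run the client as dasmohap, or (on a cluster using simple authentication) set the HADOOP_USER_NAME environment variable as discussed in the StackOverflow link above, or have dasmohap relax the directory's permissions.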