Subject: Re: How to Create file in HDFS using java Client with Permission
From: anil gupta <anilgupta84@gmail.com>
Date: Sun, 3 Mar 2013 23:07:39 -0800
To: user@hadoop.apache.org
Cc: cdh-user@cloudera.org

As per the error below, the user trying to write/read the file does not have the appropriate permission: the client is running as "hadoop", but the target directory /user/dasmohap/samir_tmp is owned by "dasmohap" with mode drwxr-xr-x, so only "dasmohap" can write into it.

File not found org.apache.hadoop.security.AccessControlException: Permission denied: user=hadoop, access=WRITE, inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x

HTH,
Anil

On Fri, Mar 1, 2013 at 3:20 AM, samir das mohapatra <samir.helpdoc@gmail.com> wrote:

> Hi All,
>     I wanted to know how to create a file in HDFS using a Java program.
>
>     I wrote some code; it works fine in the dev cluster, but I am getting an
>     error in the other cluster.
>
> Error:
> Writing data into HDFS...................
> Creating file
> File not found org.apache.hadoop.security.AccessControlException:
> Permission denied: user=hadoop, access=WRITE,
> inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4547)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4518)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1755)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1690)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1669)
>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:409)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:205)
>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44068)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
>
> Regards,
> samir.

--
Thanks & Regards,
Anil Gupta
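
For readers hitting the same error, below is a minimal sketch of creating a file from a Java HDFS client without tripping this permission check. It is only a sketch under assumptions, not a definitive fix: it assumes a cluster using simple (non-Kerberos) authentication, and the NameNode URI, file path, and sample data are placeholders. The idea is simply to run the client as the user that owns the target directory ("dasmohap") rather than "hadoop"; alternatively, the directory owner can grant write access with hdfs dfs -chmod or hdfs dfs -chown.

import java.io.OutputStream;
import java.net.URI;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.fs.permission.FsPermission;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsCreateExample {
    public static void main(String[] args) throws Exception {
        // Placeholder NameNode URI and target path; substitute your cluster's values.
        final String nameNodeUri = "hdfs://namenode.example.com:8020";
        final Path target = new Path("/user/dasmohap/samir_tmp/sample.txt");

        // /user/dasmohap/samir_tmp is owned by 'dasmohap' (drwxr-xr-x), so act as that
        // user. With simple authentication the NameNode trusts the client-supplied user.
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("dasmohap");
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            @Override
            public Void run() throws Exception {
                Configuration conf = new Configuration();
                conf.set("fs.defaultFS", nameNodeUri);

                FileSystem fs = FileSystem.get(URI.create(nameNodeUri), conf);
                try {
                    // Create (or overwrite) the file and write some data.
                    OutputStream out = fs.create(target, true);
                    out.write("Writing data into HDFS...\n".getBytes("UTF-8"));
                    out.close();

                    // Optionally relax the new file's permissions to rw-r--r--.
                    fs.setPermission(target,
                            new FsPermission(FsAction.READ_WRITE, FsAction.READ, FsAction.READ));
                } finally {
                    fs.close();
                }
                return null;
            }
        });
    }
}

On a Kerberos-secured cluster, createRemoteUser is not sufficient: the client would need to log in as the correct principal (for example via UserGroupInformation.loginUserFromKeytab) or be configured as a proxy user on the NameNode.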

File not found org.apache.hadoop.security.=
AccessControlException: Permission denied: user=3Dhadoop, access=3DWRI= TE, inode=3D"/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr= -x

HTH,
Anil

On Fri, Mar 1, 201= 3 at 3:20 AM, samir das mohapatra <samir.helpdoc@gmail.com> wrote:
Hi All,=
=A0=A0=A0 I wanted to know to to create file in hdfs using java p= rogram.

=A0 I wrote some code it is working fine in dev cluster but I am = getting error in other cluster.

Error:
Writing data into HDFS...................
Creating f= ile
File not found org.apache.hadoop.security.AccessControlException: Pe= rmission denied: user=3Dhadoop, access=3DWRITE, inode=3D"/user/dasmoha= p/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
=A0=A0=A0 at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.che= ck(FSPermissionChecker.java:205)
=A0=A0=A0 at org.apache.hadoop.hdfs.ser= ver.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
=A0= =A0=A0 at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkP= ermission(FSPermissionChecker.java:135)
=A0=A0=A0 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermi= ssion(FSNamesystem.java:4547)
=A0=A0=A0 at org.apache.hadoop.hdfs.server= .namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4518)
=A0= =A0=A0 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInte= rnal(FSNamesystem.java:1755)
=A0=A0=A0 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileI= nt(FSNamesystem.java:1690)
=A0=A0=A0 at org.apache.hadoop.hdfs.server.na= menode.FSNamesystem.startFile(FSNamesystem.java:1669)
=A0=A0=A0 at org.a= pache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServe= r.java:409)
=A0=A0=A0 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServer= SideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:2= 05)
=A0=A0=A0 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodePro= tocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProto= colProtos.java:44068)
=A0=A0=A0 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvo= ker.call(ProtobufRpcEngine.java:453)
=A0=A0=A0 at org.apache.hadoop.ipc.= RPC$Server.call(RPC.java:898)
=A0=A0=A0 at org.apache.hadoop.ipc.Server$= Handler$1.run(Server.java:1693)
=A0=A0=A0 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)=A0=A0=A0 at java.security.AccessController.doPrivileged(Native Method)=A0=A0=A0 at javax.security.auth.Subject.doAs(Subject.java:396)
=A0=A0= =A0 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInform= ation.java:1332)
=A0=A0=A0 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
=

Regards,
samir.



--
Thanks & Regards,Anil Gupta --047d7b343e38654f4004d7140449--