From: Miguel Ángel Álvarez de la Concepción <maalvarez@us.es>
To: hdfs-user@hadoop.apache.org
Subject: RE: Accessing Hadoop DFS for Data Storage and Retrieval Using Java
Date: Wed, 10 Mar 2010 16:45:00 +0100

I solved the problem by reading from a stream.
Here is the whole code, for anyone who wants to see it:

import java.io.FileOutputStream;
import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class testHadoop {
    public static final String DIR_HADOOP = "hdfs://my.machine.com";
    public static final String PORT_HADOOP = "9000";

    public static void main(String[] args) {
        Configuration config = new Configuration();
        config.set("fs.default.name", DIR_HADOOP + ":" + PORT_HADOOP);
        // act as the HDFS superuser so the client has write permission
        config.set("hadoop.job.ugi", "root, supergroup");

        try {
            FileSystem hadoopFileSystem = FileSystem.get(config);

            // create a working directory on HDFS
            Path hadoopDirectory = new Path(hadoopFileSystem.getWorkingDirectory() + "/test");
            hadoopFileSystem.mkdirs(hadoopDirectory);

            // upload a local file
            Path directorioOrigen = new Path("C://Windows/media/ringout.wav");
            hadoopFileSystem.copyFromLocalFile(directorioOrigen, hadoopDirectory);

            // download it again by reading from a stream
            // (instead of copyToLocalFile, which shells out to chmod on Windows)
            Path ficheroOrigen = new Path(hadoopFileSystem.getWorkingDirectory() + "/test/ringout.wav");
            FSDataInputStream in = hadoopFileSystem.open(ficheroOrigen);
            FileOutputStream out = new FileOutputStream("C://ringout.wav");

            while (in.available() > 0) {
                out.write(in.readByte());
            }

            out.close();
            in.close();

            hadoopFileSystem.delete(hadoopDirectory, true);
        } catch (IOException ex) {
            Logger.getLogger(testHadoop.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}

Thanks for all!
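P.S. One caveat on the loop above: available() only reports bytes already buffered, so it is not a reliable end-of-stream test in general; it just happens to be good enough for this small test. Reading into a buffer until read() returns -1 is safer and much faster than going byte by byte. A minimal sketch of the same download step, reusing the hadoopFileSystem, ficheroOrigen and output path from the listing above (untested):

            // buffered copy from HDFS to the local disk;
            // read() returning -1 is the reliable end-of-stream signal
            FSDataInputStream in = hadoopFileSystem.open(ficheroOrigen);
            FileOutputStream out = new FileOutputStream("C://ringout.wav");
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.close();
            in.close();

Hadoop also ships a helper that does the same thing: org.apache.hadoop.io.IOUtils.copyBytes(in, out, 4096, true) copies the stream through a buffer and closes both ends.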
From: Jeff Zhang [mailto:zjffdu@gmail.com]
Sent: Wednesday, March 10, 2010 15:22
To: hdfs-user@hadoop.apache.org
Subject: Re: Accessing Hadoop DFS for Data Storage and Retrieval Using Java

Yes, I think so.

2010/3/10 Miguel Ángel Álvarez de la Concepción <maalvarez@us.es>

I have installed Hadoop on CentOS (Linux) and the test code runs on Windows. Do I need Cygwin to run the test code?

From: Jeff Zhang [mailto:zjffdu@gmail.com]
Sent: Wednesday, March 10, 2010 14:10
To: hdfs-user@hadoop.apache.org
Subject: Re: Accessing Hadoop DFS for Data Storage and Retrieval Using Java

It seems you are running it on Windows, so you should install Cygwin and add C:/cygwin/bin to the Path environment variable. (Hadoop's local file system code shells out to Unix tools such as chmod, which is why the copy below fails without them.)

2010/3/10 Miguel Ángel Álvarez de la Concepción <maalvarez@us.es>

Thanks!

Now, the error occurs after copying the remote file I uploaded before:

java.io.IOException: Cannot run program "chmod": CreateProcess error=2, The system can't find the specified file
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
        at org.apache.hadoop.util.Shell.run(Shell.java:134)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:354)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:337)
        at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:481)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:473)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:280)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:372)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:372)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:208)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:142)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1216)
        at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1197)
        at hadoop.testHadoop.main(testHadoop.java:53)
Caused by: java.io.IOException: CreateProcess error=2, The system can't find the specified file
        at java.lang.ProcessImpl.create(Native Method)
        at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
        at java.lang.ProcessImpl.start(ProcessImpl.java:30)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
        ... 17 more

Thanks again for your help.

From: Jeff Zhang [mailto:zjffdu@gmail.com]
Sent: Tuesday, March 9, 2010 17:13
To: hdfs-user@hadoop.apache.org
Subject: Re: Accessing Hadoop DFS for Data Storage and Retrieval Using Java

Add a ugi configuration like this:

conf.set("hadoop.job.ugi", your_hadoop_user_name + "," + your_hadoop_group_name);
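For example, judging from the inode owner root:supergroup in your trace, that would presumably be:

conf.set("hadoop.job.ugi", "root,supergroup");

(Note: hadoop.job.ugi is only read by 0.20.x and earlier; releases with the security rework take the identity from real user authentication instead, so treat this as a development-time workaround.)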
2010/3/9 Miguel Ángel Álvarez de la Concepción <maalvarez@us.es>

Hi,

I tried to run the Java code and it doesn't work. I pasted the code below:

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class testHadoop {
    public static final String DIR_HADOOP = "hdfs://my.machine.com";
    public static final String PORT_HADOOP = "9000";

    public static void main(String[] args) {
        Configuration config = new Configuration();
        config.set("fs.default.name", DIR_HADOOP + ":" + PORT_HADOOP);

        try {
            FileSystem hadoopFileSystem = FileSystem.get(config);

            String directory = "test";
            Path hadoopDirectory = new Path(hadoopFileSystem.getWorkingDirectory() + "/" + directory);

            hadoopFileSystem.mkdirs(hadoopDirectory);

            Path sourceDirectory = new Path("C://Windows/media/ringout.wav");

            hadoopFileSystem.copyFromLocalFile(sourceDirectory, hadoopDirectory);

            Path sourceFile = new Path(hadoopFileSystem.getWorkingDirectory() + "/test/ringout.wav");
            Path targetDirectory = new Path("C://");

            hadoopFileSystem.copyToLocalFile(sourceFile, targetDirectory);

            hadoopFileSystem.delete(hadoopDirectory, true);
        } catch (IOException ex) {
            Logger.getLogger(testHadoop.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}

The result of this code is an exception:

org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=varadero\miguelangel, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:914)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1120)
        at hadoop.testHadoop.main(testHadoop.java:37)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=varadero\miguelangel, access=WRITE, inode="tmp":root:supergroup:rwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:176)
        at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.check(PermissionChecker.java:157)
        at org.apache.hadoop.hdfs.server.namenode.PermissionChecker.checkPermission(PermissionChecker.java:105)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4514)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4484)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:1766)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:1735)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:542)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

        at org.apache.hadoop.ipc.Client.call(Client.java:740)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.mkdirs(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy0.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:912)
        ... 3 more

What happened?

Miguel Ángel Álvarez de la Concepción
Departamento de Lenguajes y Sistemas Informáticos
Escuela Técnica Superior de Ingeniería Informática
Universidad de Sevilla
Phone: 954.556.086
Email: maalvarez@us.es

--
Best Regards

Jeff Zhang
