hadoop-common-dev mailing list archives

From "Venkatesh (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HADOOP-10969) RawLocalFileSystem.setPermission throws Exception
Date Wed, 13 Aug 2014 19:50:12 GMT
Venkatesh created HADOOP-10969:
----------------------------------

             Summary: RawLocalFileSystem.setPermission throws Exception
                 Key: HADOOP-10969
                 URL: https://issues.apache.org/jira/browse/HADOOP-10969
             Project: Hadoop Common
          Issue Type: Bug
         Environment: hadoop 2.3.0, Windows environment, development using Eclipse, Lenovo laptop
            Reporter: Venkatesh
            Priority: Blocker


I'm an application developer. We recently moved from CDH4.7 to CDH5.1, which took the Hadoop version from 1.x to 2.x. In order to develop in Eclipse (on WINDOWS), the following class was created:

import java.io.IOException;

import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class WindowsLocalFileSystem extends LocalFileSystem {

	public WindowsLocalFileSystem() {
		super();
	}

	@Override
	public boolean mkdirs(Path f, FsPermission permission) throws IOException {
		final boolean result = super.mkdirs(f);
		this.setPermission(f, permission);
		return result;
	}

	@Override
	public void setPermission(Path p, FsPermission permission)
			throws IOException {
		try {
			super.setPermission(p, permission);
		} catch (final IOException e) {
			// Swallow the failure: chmod is not meaningful on local Windows paths.
			System.err.println("Can't help it, hence ignoring IOException setting permission for path \""
					+ p + "\": " + e.getMessage());
		}
	}

}

This class was used in the MapReduce job as follows:

		if (RUN_LOCAL) {
			conf.set("fs.default.name", "file:///");
			conf.set("mapred.job.tracker", "local");
			conf.set("fs.file.impl",
					"org.scif.bdp.mrjobs.WindowsLocalFileSystem");
			conf.set("io.serializations",
					"org.apache.hadoop.io.serializer.JavaSerialization,"
							+ "org.apache.hadoop.io.serializer.WritableSerialization");
		}
This worked fine on CDH4.7. The same code still compiles on CDH5.1, but at execution time it throws the following stack trace:

Exception in thread "main" java.lang.NullPointerException
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:451)
	at org.apache.hadoop.util.Shell.run(Shell.java:424)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:656)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:745)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:728)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:467)
	at com.scif.bdp.common.WindowsLocalFileSystem.setPermission(WindowsLocalFileSystem.java:26)
	at com.scif.bdp.common.WindowsLocalFileSystem.mkdirs(WindowsLocalFileSystem.java:17)
	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
	at com.scif.bdp.mrjobs.DeDup.run(DeDup.java:55)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at com.scif.bdp.mrjobs.DeDup.main(DeDup.java:59)

(Note: DeDup is my MapReduce class for removing duplicates.)

Upon investigation, the only change I found was in the .setPermission() method: it now invokes Native.POSIX.chmod instead of Native.chmod.
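One likely reason the override in WindowsLocalFileSystem.setPermission does not help here: the failure surfacing from ProcessBuilder.start is a NullPointerException, an unchecked exception (on Windows this is commonly reported when Hadoop's winutils.exe cannot be located), so a catch block scoped to IOException never intercepts it. A minimal, Hadoop-free sketch of that catch-scope behavior (class, interface, and message strings below are illustrative, not from the report):

```java
import java.io.IOException;

public class CatchScopeDemo {

    // Stand-in for a call that may throw a checked IOException.
    interface ThrowingCall {
        void run() throws IOException;
    }

    // Mirrors the override's pattern: only IOException is caught.
    static String attempt(ThrowingCall call) {
        try {
            call.run();
            return "ok";
        } catch (IOException e) {
            return "caught IOException";
        }
        // A NullPointerException thrown inside call.run() is unchecked
        // and propagates past this method untouched.
    }

    public static void main(String[] args) {
        // Checked exception: the handler absorbs it, as intended.
        System.out.println(attempt(() -> {
            throw new IOException("chmod failed");
        }));

        // Unchecked exception: it escapes the IOException handler entirely.
        try {
            attempt(() -> {
                throw new NullPointerException("thrown by native call");
            });
        } catch (NullPointerException e) {
            System.out.println("NPE escaped the IOException handler");
        }
    }
}
```

This suggests the override would need a broader catch (e.g. RuntimeException) to suppress the failure seen in the stack trace, though the underlying cause in RawLocalFileSystem would remain.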

--
This message was sent by Atlassian JIRA
(v6.2#6252)
