hadoop-hdfs-issues mailing list archives

From "Colin Patrick McCabe (Created) (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HDFS-3134) harden edit log loader against malformed or malicious input
Date Fri, 23 Mar 2012 20:33:28 GMT
harden edit log loader against malformed or malicious input

                 Key: HDFS-3134
                 URL: https://issues.apache.org/jira/browse/HDFS-3134
             Project: Hadoop HDFS
          Issue Type: Bug
            Reporter: Colin Patrick McCabe
            Assignee: Colin Patrick McCabe

Currently, the edit log loader does not handle bad or malicious input sensibly.

We can often cause an OutOfMemoryError, a NullPointerException, or another unchecked
exception to be thrown by feeding the edit log loader bad input.  In some environments, an
OutOfMemoryError causes the JVM process to be terminated.

It's clear that we want these failures to surface as IOException rather than as unchecked
exceptions.  We also want to avoid out-of-memory situations entirely.

The main task here is to put a sensible upper limit on the lengths of the arrays and strings
we allocate based on length fields read from the input.  The other task is to avoid triggering
unchecked exceptions (by dereferencing potentially-null references, for example).  Instead, we
should validate the input ahead of time and report a more sensible error message that reflects
the actual problem with the input.
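A minimal sketch of the kind of defensive read described above (the class name, method name,
and limit are hypothetical illustrations, not the actual HDFS edit log loader code):

```java
import java.io.DataInputStream;
import java.io.IOException;

public class SafeEditLogReader {
    // Hypothetical upper bound on any length field read from the edit log.
    private static final int MAX_BLOB_LENGTH = 1024 * 1024; // 1 MiB

    // Read a length-prefixed byte array.  A negative or absurdly large length
    // is rejected with an IOException describing the bad input, instead of
    // letting "new byte[length]" throw OutOfMemoryError (or
    // NegativeArraySizeException) on attacker-controlled data.
    public static byte[] readBlob(DataInputStream in) throws IOException {
        int length = in.readInt();
        if (length < 0 || length > MAX_BLOB_LENGTH) {
            throw new IOException("Invalid blob length " + length
                + " in edit log (expected 0.." + MAX_BLOB_LENGTH + ")");
        }
        byte[] buf = new byte[length];
        in.readFully(buf);
        return buf;
    }
}
```

The same pattern applies to any field whose size or presence is taken from the stream:
check it against an explicit bound first, and turn a violation into an IOException with a
message that points at the malformed input.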


