hadoop-common-dev mailing list archives

From "Nigel Daley (JIRA)" <j...@apache.org>
Subject [jira] Created: (HADOOP-1629) Block CRC Unit Tests: upgrade test
Date Wed, 18 Jul 2007 03:42:04 GMT
Block CRC Unit Tests: upgrade test
----------------------------------

                 Key: HADOOP-1629
                 URL: https://issues.apache.org/jira/browse/HADOOP-1629
             Project: Hadoop
          Issue Type: Test
          Components: dfs
    Affects Versions: 0.14.0
            Reporter: Nigel Daley
            Priority: Blocker
             Fix For: 0.14.0


HADOOP-1286 introduced a distributed upgrade framework.  One or more unit tests should be developed
that start with a zipped-up Hadoop 0.12 file system (included under version control in Hadoop's
src/test directory) and attempt to upgrade it to the current version of Hadoop (i.e. the version
the tests are running against).  A rough sketch of such a test follows the list below.  The
zipped-up file system should include some "interesting" files, such as:

- zero length files
- file with replication set higher than number of datanodes
- file with no .crc file
- file with corrupt .crc file
- file with multiple blocks (will need to set dfs.block.size to a small value)
- file with multiple checksum blocks
- empty directory
- all of the above again but with a different io.bytes.per.checksum setting
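
For illustration, here is a rough sketch of how such a test might look.  The class and file names
(TestUpgradeFromOldImage, hadoop-0.12-dfs-dir.zip, the /upgrade/... paths), the unpackStoredImage()
helper, and the MiniDFSCluster constructor / StartupOption usage are all assumptions, not an
existing API; exact signatures differ between versions:

package org.apache.hadoop.dfs;

import junit.framework.TestCase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestUpgradeFromOldImage extends TestCase {

  public void testUpgradeFrom012Image() throws Exception {
    Configuration conf = new Configuration();

    // Hypothetical helper: expand the version-controlled 0.12 image from
    // src/test into the name/data directories the mini-cluster will use.
    unpackStoredImage(conf, "hadoop-0.12-dfs-dir.zip");

    // Start against the old image with the upgrade startup option.
    // The constructor and StartupOption location are assumptions here;
    // exact signatures differ between versions.
    MiniDFSCluster cluster =
        new MiniDFSCluster(conf, 1, FSConstants.StartupOption.UPGRADE);
    try {
      FileSystem fs = cluster.getFileSystem();

      // Each "interesting" file from the old image should still exist and
      // be readable with the expected contents after the upgrade.
      assertTrue(fs.exists(new Path("/upgrade/zero-length")));
      FSDataInputStream in = fs.open(new Path("/upgrade/zero-length"));
      assertEquals(-1, in.read());   // zero-length file: immediate EOF
      in.close();
      // ... similar checks for the multi-block, over-replicated and .crc cases
    } finally {
      cluster.shutdown();
    }
  }

  private void unpackStoredImage(Configuration conf, String imageName) {
    // Placeholder: unzip the stored image into dfs.name.dir / dfs.data.dir.
  }
}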

The class that generates the zipped-up file system should also be included in this patch.
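
A minimal sketch of what such a generator might cover, using only the plain FileSystem API.  The
class name, the /upgrade/... paths, the small dfs.block.size value, and the visible ".name.crc"
sibling-file convention used below are assumptions for illustration only:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OldImageGenerator {

  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    conf.set("dfs.block.size", "1024");          // small blocks -> multi-block files
    conf.setInt("io.bytes.per.checksum", 512);   // repeat the run with another value
    FileSystem fs = FileSystem.get(conf);

    // zero-length file
    fs.create(new Path("/upgrade/zero-length")).close();

    // file with replication set higher than the number of datanodes
    Path overReplicated = new Path("/upgrade/over-replicated");
    writeBytes(fs, overReplicated, 1024);
    fs.setReplication(overReplicated, (short) 10);

    // file spanning several blocks and several checksum blocks
    writeBytes(fs, new Path("/upgrade/multi-block"), 8 * 1024);

    // file whose .crc sibling is then removed (the pre-Block-CRC layout kept
    // checksums in a visible ".name.crc" file -- an assumption here)
    Path noCrc = new Path("/upgrade/no-crc");
    writeBytes(fs, noCrc, 1024);
    fs.delete(new Path("/upgrade/.no-crc.crc"));

    // empty directory
    fs.mkdirs(new Path("/upgrade/empty-dir"));
  }

  private static void writeBytes(FileSystem fs, Path p, int len) throws IOException {
    FSDataOutputStream out = fs.create(p);
    for (int i = 0; i < len; i++) {
      out.write(i & 0xff);
    }
    out.close();
  }
}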

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

