hadoop-common-issues mailing list archives

From "Steve Loughran (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HADOOP-11363) Hadoop maven surefire-plugin uses must set heap size
Date Mon, 08 Dec 2014 22:58:13 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-11363?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran updated HADOOP-11363:
------------------------------------
    Attachment: HADOOP-11363-002.patch

Patch -002 goes to a 4G heap.

One thing to note: this will also be the heap requirement on local dev systems, though anyone
who wants to tune it down can set the {{maven-surefire-plugin.argLine}} property to a
different value.
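
As a minimal sketch of one way to do that locally (this assumes the patch wires
{{maven-surefire-plugin.argLine}} into surefire's {{argLine}}; the profile name and the 2G
value are just examples, not part of the patch), a profile in {{~/.m2/settings.xml}} could
redefine the property:

{code:xml}
<!-- fragment of ~/.m2/settings.xml: override the project default test heap -->
<profiles>
  <profile>
    <id>smaller-test-heap</id>
    <properties>
      <!-- tune the forked surefire JVM heap down from the project default -->
      <maven-surefire-plugin.argLine>-Xmx2048m</maven-surefire-plugin.argLine>
    </properties>
  </profile>
</profiles>
<activeProfiles>
  <activeProfile>smaller-test-heap</activeProfile>
</activeProfiles>
{code}

The same property can also be overridden for a single run with
{{mvn test -Dmaven-surefire-plugin.argLine=-Xmx2048m}}.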

> Hadoop maven surefire-plugin uses must set heap size
> ----------------------------------------------------
>
>                 Key: HADOOP-11363
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11363
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 2.7.0
>         Environment: java 8
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>         Attachments: HADOOP-11363-001.patch, HADOOP-11363-002.patch
>
>
> Some of the hadoop tests (especially HBase) are running out of memory on Java 8, because
> there is not enough heap for them.
> The heap size of surefire test runs is *not* set in {{MAVEN_OPTS}}; it needs to be explicitly
> set as an argument to the test run.
> I propose:
> # {{hadoop-project/pom.xml}} defines the maximum heap size and test timeouts for surefire
> builds as properties
> # modules which run tests use these values for their memory & timeout settings
> # these modules should also set the surefire version they want to use
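
To make that split concrete, a rough sketch of how it might look follows. It is illustrative
only: apart from {{maven-surefire-plugin.argLine}}, the property names, the timeout value and
the surefire version are placeholders, not what the patch actually uses.

{code:xml}
<!-- hadoop-project/pom.xml: shared defaults (values illustrative) -->
<properties>
  <!-- heap for the forked surefire JVM; patch -002 uses 4G -->
  <maven-surefire-plugin.argLine>-Xmx4096m</maven-surefire-plugin.argLine>
  <!-- per-fork test timeout in seconds (hypothetical property name/value) -->
  <surefire.fork.timeout>900</surefire.fork.timeout>
  <!-- surefire version for modules to pick up (hypothetical property name/value) -->
  <surefire.version>2.17</surefire.version>
</properties>

<!-- a module that runs tests: reference the shared properties -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>${surefire.version}</version>
  <configuration>
    <argLine>${maven-surefire-plugin.argLine}</argLine>
    <forkedProcessTimeoutInSeconds>${surefire.fork.timeout}</forkedProcessTimeoutInSeconds>
  </configuration>
</plugin>
{code}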



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
