hadoop-common-dev mailing list archives

From "Yiping Han (JIRA)" <j...@apache.org>
Subject [jira] Commented: (HADOOP-1864) Support for big jar file (>2G)
Date Thu, 25 Oct 2007 22:29:50 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-1864?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12537755 ]

Yiping Han commented on HADOOP-1864:


Yes. Either this issue or HADOOP-2019 should satisfy our requirement.

> Support for big jar file (>2G)
> ------------------------------
>                 Key: HADOOP-1864
>                 URL: https://issues.apache.org/jira/browse/HADOOP-1864
>             Project: Hadoop
>          Issue Type: Bug
>          Components: contrib/streaming
>    Affects Versions: 0.14.1
>            Reporter: Yiping Han
>            Priority: Critical
> We have a huge binary that needs to be distributed onto the tasktracker nodes in Hadoop
streaming mode. We've tried both the -file option and the -cacheArchive option, but it seems
the tasktracker nodes cannot unjar jar files bigger than 2G. We are considering splitting our
binary into multiple jars, but -file does not seem to support that. We would also prefer
-cacheArchive for performance reasons, but it seems -cacheArchive does not allow more than one
appearance in the streaming options. Even if -cacheArchive supported multiple jars, we would
still need a way to unpack the jars into a single directory tree, instead of using multiple
symbolic links.
> So, in general, we need a feasible and efficient way to distribute large (>2G) binaries
for Hadoop streaming. We don't know whether there is an existing solution that we either missed
or used incorrectly, or whether extra work is needed to provide one.
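
The workaround the report considers (splitting one oversized archive into several smaller ones) can be sketched as a greedy partition of files by cumulative size; the 2G ceiling the reporter hits is consistent with the signed 32-bit offset limit of the classic (pre-Zip64) zip/jar format. A minimal, hypothetical sketch (not part of any Hadoop API; names are made up for illustration):

```python
# Hypothetical sketch: greedily partition files into groups, each group
# small enough to be packed into one jar under the ~2 GiB pre-Zip64 limit.
LIMIT = 2 * 1024 ** 3  # 2 GiB, the observed unjar failure threshold

def partition_by_size(files, limit=LIMIT):
    """files: iterable of (path, size) pairs.
    Returns a list of path groups; each group's cumulative size stays
    within `limit` (a single file larger than `limit` gets its own group
    and would still need a different fix, e.g. Zip64 support)."""
    groups, current, current_size = [], [], 0
    for path, size in sorted(files, key=lambda f: -f[1]):  # biggest first
        if current and current_size + size > limit:
            groups.append(current)
            current, current_size = [], 0
        current.append(path)
        current_size += size
    if current:
        groups.append(current)
    return groups
```

Each resulting group could then be jarred separately and shipped via the distributed cache, which is exactly where the reporter runs into the single-appearance restriction on -cacheArchive described above.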

This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.
