spark-issues mailing list archives

From "James Gan (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-6190) create LargeByteBuffer abstraction for eliminating 2GB limit on blocks
Date Tue, 29 May 2018 03:41:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-6190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16493061#comment-16493061 ]

James Gan commented on SPARK-6190:
----------------------------------

Thanks, [~irashid]. It's great to see that your tests show the most important issues are
already fixed in Spark 2.3.

> create LargeByteBuffer abstraction for eliminating 2GB limit on blocks
> ----------------------------------------------------------------------
>
>                 Key: SPARK-6190
>                 URL: https://issues.apache.org/jira/browse/SPARK-6190
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>            Reporter: Imran Rashid
>            Assignee: Josh Rosen
>            Priority: Major
>         Attachments: LargeByteBuffer_v3.pdf
>
>
> A key component in eliminating the 2GB limit on blocks is creating a proper abstraction
> for storing more than 2GB.  Currently Spark is limited by its reliance on nio ByteBuffer and
> netty ByteBuf, both of which are capped at 2GB.  This task will introduce the new abstraction
> and the relevant implementation and utilities, without affecting the existing implementation
> at all.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

