spark-issues mailing list archives

From "Sean Zhong (JIRA)" <>
Subject [jira] [Commented] (SPARK-15786) joinWith bytecode generation calling ByteBuffer.wrap with InternalRow
Date Thu, 16 Jun 2016 21:42:05 GMT


Sean Zhong commented on SPARK-15786:

[~yhuai] Sure, we definitely can improve it.

> joinWith bytecode generation calling ByteBuffer.wrap with InternalRow
> ---------------------------------------------------------------------
>                 Key: SPARK-15786
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1, 2.0.0
>            Reporter: Richard Marscher
>            Assignee: Sean Zhong
>             Fix For: 2.0.0
> {code}
> java.lang.RuntimeException: Error while decoding: java.util.concurrent.ExecutionException:
> java.lang.Exception: failed to compile: org.codehaus.commons.compiler.CompileException: File
> '', Line 36, Column 107: No applicable constructor/method found for actual parameters
> "org.apache.spark.sql.catalyst.InternalRow"; candidates are: "public static java.nio.ByteBuffer
> java.nio.ByteBuffer.wrap(byte[])", "public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[],
> int, int)"
> {code}
> I have been trying to use joinWith along with Option data types to approximate the RDD
> semantics for outer joins with Dataset, to get a nicer API for Scala. However, using the
> joinWith syntax leads to bytecode generation that tries to pass an InternalRow object into
> ByteBuffer.wrap, which expects a byte[] (with or without a pair of int offset/length arguments).
> I have a notebook reproducing this against 2.0 preview in Databricks Community Edition:
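For reference, the two "candidates" listed in the Janino error above are the only `ByteBuffer.wrap` overloads in the JDK, and both require a `byte[]`; that is why the generated code handing it an `InternalRow` fails to compile. A minimal sketch of the two valid call shapes (plain JDK, no Spark; the class name is illustrative):

```java
import java.nio.ByteBuffer;

public class WrapDemo {
    public static void main(String[] args) {
        byte[] data = {10, 20, 30, 40};

        // Candidate 1: wrap(byte[]) — wraps the whole array.
        ByteBuffer whole = ByteBuffer.wrap(data);

        // Candidate 2: wrap(byte[], int offset, int length) — wraps a slice.
        ByteBuffer slice = ByteBuffer.wrap(data, 1, 2);

        System.out.println(whole.remaining()); // 4
        System.out.println(slice.remaining()); // 2

        // Passing anything other than a byte[] (e.g. an InternalRow) is a
        // compile-time error: "No applicable constructor/method found".
    }
}
```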

This message was sent by Atlassian JIRA

