From "Richard Marscher (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-15786) Spark SQL - joinWith bytecode generation calling ByteBuffer.wrap with InternalRow
Date Mon, 06 Jun 2016 18:56:21 GMT
Richard Marscher created SPARK-15786:
----------------------------------------

             Summary: Spark SQL - joinWith bytecode generation calling ByteBuffer.wrap with InternalRow
                 Key: SPARK-15786
                 URL: https://issues.apache.org/jira/browse/SPARK-15786
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.6.1, 2.0.0
            Reporter: Richard Marscher


{code}
java.lang.RuntimeException: Error while decoding: java.util.concurrent.ExecutionException:
java.lang.Exception: failed to compile: org.codehaus.commons.compiler.CompileException:
File 'generated.java', Line 36, Column 107: No applicable constructor/method found for
actual parameters "org.apache.spark.sql.catalyst.InternalRow"; candidates are:
"public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[])",
"public static java.nio.ByteBuffer java.nio.ByteBuffer.wrap(byte[], int, int)"
{code}

I have been trying to use joinWith together with Option element types to approximate the
RDD outer-join semantics with Dataset, for a nicer Scala API. However, using the
Dataset.as[] syntax on the joined result leads to generated bytecode that passes an
InternalRow object into ByteBuffer.wrap, which only accepts a byte[] (optionally followed
by int offset and length arguments).
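
For reference, here is a minimal sketch of the failing pattern. The case classes, names,
and sample data are illustrative assumptions on my part rather than the exact notebook code:

{code}
import org.apache.spark.sql.SparkSession

object Spark15786Repro {
  // Hypothetical case classes, just to give joinWith something to join on.
  case class A(id: Int)
  case class B(id: Int, value: String)

  def main(args: Array[String]): Unit = {
    // 2.0-style session; the notebook ran against the 2.0 preview.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-15786")
      .getOrCreate()
    import spark.implicits._

    val left  = spark.createDataset(Seq(A(1), A(2)))
    val right = spark.createDataset(Seq(B(1, "one")))

    // Full outer joinWith, then view the result as Option pairs to
    // approximate RDD fullOuterJoin semantics. Decoding the joined
    // rows is what triggers the generated-code error above.
    val joined = left
      .joinWith(right, left("id") === right("id"), "full_outer")
      .as[(Option[A], Option[B])]

    joined.collect().foreach(println)
  }
}
{code}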

I have a notebook reproducing this against the 2.0 preview in Databricks Community Edition: https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/160347920874755/1039589581260901/673639177603143/latest.html



