spark-issues mailing list archives

From "Randy Tidd (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-22296) CodeGenerator - failed to compile when constructor has scala.collection.mutable.Seq vs. scala.collection.Seq
Date Fri, 20 Oct 2017 18:33:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-22296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16213015#comment-16213015
] 

Randy Tidd commented on SPARK-22296:
------------------------------------

Thank you, I was just installing 2.2.0 and related components to try this, but you beat me to
it.  Glad to hear it's fixed.

> CodeGenerator - failed to compile when constructor has scala.collection.mutable.Seq vs.
scala.collection.Seq
> ------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-22296
>                 URL: https://issues.apache.org/jira/browse/SPARK-22296
>             Project: Spark
>          Issue Type: Bug
>          Components: Optimizer
>    Affects Versions: 2.1.0
>            Reporter: Randy Tidd
>
> This is with Scala 2.11.
> We have a case class that has a constructor with 85 args, the last two of which are:
>                      var chargesInst : scala.collection.mutable.Seq[ChargeInstitutional]
= scala.collection.mutable.Seq.empty[ChargeInstitutional],
>                      var chargesProf : scala.collection.mutable.Seq[ChargeProfessional]
= scala.collection.mutable.Seq.empty[ChargeProfessional]
> A mutable Seq in the constructor of a case class is probably poor form, but Scala allows
it.  When we run this job we get this error:
> build   17-Oct-2017 05:30:50        2017-10-17 09:30:50 [Executor task launch worker-1]
ERROR o.a.s.s.c.e.codegen.CodeGenerator - failed to compile: org.codehaus.commons.compiler.CompileException:
File 'generated.java', Line 8217, Column 70: No applicable constructor/method found for actual
parameters "java.lang.String, java.lang.String, long, java.lang.String, long, long, long,
java.lang.String, long, long, double, scala.Option, scala.Option, java.lang.String, java.lang.String,
long, java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, int, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, long, long, long, long, long, scala.Option, scala.Option, scala.Option,
scala.Option, scala.Option, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, java.lang.String, long, java.lang.String, int, double, double, java.lang.String,
java.lang.String, java.lang.String, long, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, long, long, long, long, java.lang.String, com.xyz.xyz.xyz.domain.Patient,
com.xyz.xyz.xyz.domain.Physician, scala.collection.Seq, scala.collection.Seq, java.lang.String,
long, java.lang.String, int, int, boolean, boolean, scala.collection.Seq, boolean, scala.collection.Seq,
boolean, scala.collection.Seq, scala.collection.Seq"; candidates are: "com.xyz.xyz.xyz.domain.Account(java.lang.String,
java.lang.String, long, java.lang.String, long, long, long, java.lang.String, long, long,
double, scala.Option, scala.Option, java.lang.String, java.lang.String, long, java.lang.String,
java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
int, java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
long, long, long, long, long, scala.Option, scala.Option, scala.Option, scala.Option, scala.Option,
java.lang.String, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
java.lang.String, long, java.lang.String, int, double, double, java.lang.String, java.lang.String,
java.lang.String, long, java.lang.String, java.lang.String, java.lang.String, java.lang.String,
long, long, long, long, java.lang.String, com.xyz.xyz.xyz.domain.Patient, com.xyz.xyz.xyz.domain.Physician,
scala.collection.Seq, scala.collection.Seq, java.lang.String, long, java.lang.String, int,
int, boolean, boolean, scala.collection.Seq, boolean, scala.collection.Seq, boolean, scala.collection.mutable.Seq,
scala.collection.mutable.Seq)"
> The relevant lines are:
> build   17-Oct-2017 05:30:50        /* 093 */   private scala.collection.Seq argValue84;
> build   17-Oct-2017 05:30:50        /* 094 */   private scala.collection.Seq argValue85;
> and
> build   17-Oct-2017 05:30:54        /* 8217 */     final com.xyz.xyz.xyz.domain.Account
value1 = false ? null : new com.xyz.xyz.xyz.domain.Account(argValue2, argValue3, argValue4,
argValue5, argValue6, argValue7, argValue8, argValue9, argValue10, argValue11, argValue12,
argValue13, argValue14, argValue15, argValue16, argValue17, argValue18, argValue19, argValue20,
argValue21, argValue22, argValue23, argValue24, argValue25, argValue26, argValue27, argValue28,
argValue29, argValue30, argValue31, argValue32, argValue33, argValue34, argValue35, argValue36,
argValue37, argValue38, argValue39, argValue40, argValue41, argValue42, argValue43, argValue44,
argValue45, argValue46, argValue47, argValue48, argValue49, argValue50, argValue51, argValue52,
argValue53, argValue54, argValue55, argValue56, argValue57, argValue58, argValue59, argValue60,
argValue61, argValue62, argValue63, argValue64, argValue65, argValue66, argValue67, argValue68,
argValue69, argValue70, argValue71, argValue72, argValue73, argValue74, argValue75, argValue76,
argValue77, argValue78, argValue79, argValue80, argValue81, argValue82, argValue83, argValue84,
argValue85);
> In short, Spark uses scala.collection.Seq in the generated code, which is not compatible
with the scala.collection.mutable.Seq parameters in our case class, resulting in a compile
failure at runtime.
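The mismatch can be seen outside of Spark: every scala.collection.mutable.Seq widens to scala.collection.Seq, but not the other way around, and the reverse assignment is exactly what the generated constructor call attempts. A minimal sketch (the class and field names below are illustrative stand-ins, not the reporter's actual domain classes):

```scala
import scala.collection.mutable

// Illustrative stand-in for the reporter's 85-arg case class: only the
// problematic mutable.Seq parameter is kept.
case class Account(chargesInst: mutable.Seq[String] = mutable.Seq.empty)

object Repro {
  def main(args: Array[String]): Unit = {
    // Widening works: a mutable.Seq is a scala.collection.Seq.
    val widened: scala.collection.Seq[String] = mutable.Seq("a", "b")

    // The generated code effectively declares its argValue fields as
    // scala.collection.Seq and passes them to the constructor:
    //
    //   Account(widened)   // does not compile: Seq is not a mutable.Seq
    //
    // which is why Janino reports "No applicable constructor/method found".

    // Passing a mutable.Seq directly compiles fine:
    println(Account(mutable.Seq("a", "b")))
  }
}
```

On affected versions, declaring the case-class fields as plain scala.collection.Seq sidesteps the mismatch; per this thread, upgrading to 2.2.0 resolves it.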




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

