spark-user mailing list archives

From 金国栋 <scu...@gmail.com>
Subject Spark build error
Date Wed, 18 Nov 2015 02:45:36 GMT
Hi,

I tried to build the Spark source code from GitHub. Building from the command line with `sbt/sbt assembly` succeeded, but I ran into an error when compiling the project in IntelliJ IDEA (v14.1.5).


The error log is below:
Error:scala:
     while compiling:
/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala
        during phase: jvm
     library version: version 2.10.5
    compiler version: version 2.10.5
  reconstructed args: -nobootcp -javabootclasspath : -deprecation -feature
-classpath
/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/lib/tools.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/htmlconverter.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/
sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_05.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/core/target/scala-2.10/classes:/Users/ray/Documents/P01_Project/Spark-Github/spark/core/target/scala-2.10/classes:/Users/ray/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7-hadoop2.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7.jar:/Users/ray/.m2/repository/org/apache/avro/avro-ipc/1.7.7/avro-ipc-1.7.7-tests.jar:/Users/ray/.m2/repository/com/twitter/chill_2.10/0.5.0/chill_2.10-0.5.0.jar:/Users/ray/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/Users/ray/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/Users/ray/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/ray/.m2/repository/com/twitter/chill-java/0.5.0/chill-java-0.5.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/Users/ray/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/ray/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/Users/ray/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/ray/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/Users/ray/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/ray/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/Users/ray/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:
/Users/ray/.m2/repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.2.0/hadoop-hdfs-2.2.0.jar:/Users/ray/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.2.0/hadoop-mapreduce-client-app-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.2.0/hadoop-mapreduce-client-common-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.2.0/hadoop-yarn-client-2.2.0.jar:/Users/ray/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/ray/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/Users/ray/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-grizzly2/1.9/jersey-test-framework-grizzly2-1.9.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-core/1.9/jersey-test-framework-core-1.9.jar:/Users/ray/.m2/repository/javax/servlet/javax.servlet-api/3.0.1/javax.servlet-api-3.0.1.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-grizzly2/1.9/jersey-grizzly2-1.9.jar:/Users/ray/.m2/repository/org/glassfish/grizzly/grizzly-http/2.1.2/grizzly-http-2.1.2.jar:/Users/ray/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.2/grizzly-framework-2.1.2.jar:/Users/ray/.m2/repository/org/glassfish/gmbal/gmbal-api-only/3.0.0-b023/gmbal-api-only-3.0.0-b023.jar:/Users/ray/.m2/repository/org/glassfish/external/management-api/3.0.0-b012/management-api-3.0.0-b012.jar:/Users/ray/.m2/repository/org/glassfish/grizzly/grizzly-http-server/2.1.2/grizzly-http-server-2.1.2.jar:/Users/ray/.m2/repository/org/glassfish/grizzly/grizzly-rcm/2.1.2/grizzly-rcm-2.1.2.jar:/Users/ray/.m2/repository/org/glassfish/grizzly/grizzly-http-servlet/2.1.2/grizzly-h
ttp-servlet-2.1.2.jar:/Users/ray/.m2/repository/org/glassfish/javax.servlet/3.1/javax.servlet-3.1.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/Users/ray/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/Users/ray/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/Users/ray/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/ray/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/ray/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar:/Users/ray/.m2/repository/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar:/Users/ray/.m2/repository/com/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.2.0/hadoop-yarn-server-common-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.2.0/hadoop-mapreduce-client-shuffle-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.2.0/hadoop-yarn-api-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.2.0/hadoop-mapreduce-client-core-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.2.0/hadoop-yarn-common-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.2.0/hadoop-mapreduce-client-jobclient-2.2.0.jar:/Users/ray/.m2/repository/org/apache/hadoop/hadoop-annotations/2.2.0/hadoop-annotations-2.2.0.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/launcher/target/scala-2.10/classes:/Users/ray/Documents/P01_Project/Spark-Github/spark/network/common/target/scala-2.10/classes:/Users/ray/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/network/shuffle/target/scala-2.10/classes:/Users/ray/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/Users/ray/Documents/P01_Proj
ect/Spark-Github/spark/unsafe/target/scala-2.10/classes:/Users/ray/.m2/repository/net/java/dev/jets3t/jets3t/0.7.1/jets3t-0.7.1.jar:/Users/ray/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/ray/.m2/repository/org/apache/curator/curator-recipes/2.4.0/curator-recipes-2.4.0.jar:/Users/ray/.m2/repository/org/apache/curator/curator-framework/2.4.0/curator-framework-2.4.0.jar:/Users/ray/.m2/repository/org/apache/curator/curator-client/2.4.0/curator-client-2.4.0.jar:/Users/ray/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar:/Users/ray/.m2/repository/jline/jline/0.9.94/jline-0.9.94.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-plus/8.1.14.v20131031/jetty-plus-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/orbit/javax.transaction/1.1.1.v201105210645/javax.transaction-1.1.1.v201105210645.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-webapp/8.1.14.v20131031/jetty-webapp-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-xml/8.1.14.v20131031/jetty-xml-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-jndi/8.1.14.v20131031/jetty-jndi-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/orbit/javax.mail.glassfish/1.4.1.v201005082020/javax.mail.glassfish-1.4.1.v201005082020.jar:/Users/ray/.m2/repository/org/eclipse/jetty/orbit/javax.activation/1.1.0.v201105071233/javax.activation-1.1.0.v201105071233.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-security/8.1.14.v20131031/jetty-security-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-util/8.1.14.v20131031/jetty-util-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-server/8.1.14.v20131031/jetty-server-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-http/8.1.14.v20131031/jetty-http-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-io/8.1.14.v20131031/jetty-io-8.1.14.v20131031.jar:/Users/
ray/.m2/repository/org/eclipse/jetty/jetty-continuation/8.1.14.v20131031/jetty-continuation-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/jetty-servlet/8.1.14.v20131031/jetty-servlet-8.1.14.v20131031.jar:/Users/ray/.m2/repository/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar:/Users/ray/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/ray/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/Users/ray/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/ray/.m2/repository/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar:/Users/ray/.m2/repository/org/slf4j/jul-to-slf4j/1.7.10/jul-to-slf4j-1.7.10.jar:/Users/ray/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.10/jcl-over-slf4j-1.7.10.jar:/Users/ray/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/ray/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar:/Users/ray/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/Users/ray/.m2/repository/org/xerial/snappy/snappy-java/1.1.2/snappy-java-1.1.2.jar:/Users/ray/.m2/repository/net/jpountz/lz4/lz4/1.3.0/lz4-1.3.0.jar:/Users/ray/.m2/repository/org/roaringbitmap/RoaringBitmap/0.4.5/RoaringBitmap-0.4.5.jar:/Users/ray/.m2/repository/commons-net/commons-net/2.2/commons-net-2.2.jar:/Users/ray/.m2/repository/com/typesafe/akka/akka-remote_2.10/2.3.11/akka-remote_2.10-2.3.11.jar:/Users/ray/.m2/repository/com/typesafe/akka/akka-actor_2.10/2.3.11/akka-actor_2.10-2.3.11.jar:/Users/ray/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/Users/ray/.m2/repository/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/Users/ray/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/ray/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/Users/ray/.m2/repository/com/typesafe/akka/akka-slf4j_2.10/2.3.11/akka-slf4j_2.10-2.3.11.jar:/Users/ray/
.m2/repository/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.jar:/Users/ray/.m2/repository/org/json4s/json4s-jackson_2.10/3.2.10/json4s-jackson_2.10-3.2.10.jar:/Users/ray/.m2/repository/org/json4s/json4s-core_2.10/3.2.10/json4s-core_2.10-3.2.10.jar:/Users/ray/.m2/repository/org/json4s/json4s-ast_2.10/3.2.10/json4s-ast_2.10-3.2.10.jar:/Users/ray/.m2/repository/org/scala-lang/scalap/2.10.5/scalap-2.10.5.jar:/Users/ray/.m2/repository/org/scala-lang/scala-compiler/2.10.5/scala-compiler-2.10.5.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/Users/ray/.m2/repository/asm/asm/3.1/asm-3.1.jar:/Users/ray/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/ray/.m2/repository/org/apache/mesos/mesos/0.21.1/mesos-0.21.1-shaded-protobuf.jar:/Users/ray/.m2/repository/io/netty/netty-all/4.0.29.Final/netty-all-4.0.29.Final.jar:/Users/ray/.m2/repository/com/clearspring/analytics/stream/2.7.0/stream-2.7.0.jar:/Users/ray/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.2/metrics-core-3.1.2.jar:/Users/ray/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.2/metrics-jvm-3.1.2.jar:/Users/ray/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.2/metrics-json-3.1.2.jar:/Users/ray/.m2/repository/io/dropwizard/metrics/metrics-graphite/3.1.2/metrics-graphite-3.1.2.jar:/Users/ray/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.10/2.4.4/jackson-module-scala_2.10-2.4.4.jar:/Users/ray/.m2/repository/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar:/Users/ray/.m2/repository/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/Users/ray/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/Users/ray/.m2/repository/org/tachyonproject/tachyon-client/0.8.1/tachyon-client-0.8.1.jar:/Users/ray/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/ray/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/ray/.m2/repository/org/tachyonproject/tachyon-underfs-hdfs/0.8.1/tachyon-underfs-hdfs-0.
8.1.jar:/Users/ray/.m2/repository/org/tachyonproject/tachyon-underfs-s3/0.8.1/tachyon-underfs-s3-0.8.1.jar:/Users/ray/.m2/repository/org/tachyonproject/tachyon-underfs-local/0.8.1/tachyon-underfs-local-0.8.1.jar:/Users/ray/.m2/repository/net/razorvine/pyrolite/4.9/pyrolite-4.9.jar:/Users/ray/.m2/repository/net/sf/py4j/py4j/0.9/py4j-0.9.jar:/Users/ray/Documents/P01_Project/Spark-Github/spark/sql/catalyst/target/scala-2.10/classes:/Users/ray/.m2/repository/org/scala-lang/scala-reflect/2.10.5/scala-reflect-2.10.5.jar:/Users/ray/.m2/repository/org/codehaus/janino/janino/2.7.8/janino-2.7.8.jar:/Users/ray/.m2/repository/org/codehaus/janino/commons-compiler/2.7.8/commons-compiler-2.7.8.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-column/1.7.0/parquet-column-1.7.0.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-common/1.7.0/parquet-common-1.7.0.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-encoding/1.7.0/parquet-encoding-1.7.0.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-generator/1.7.0/parquet-generator-1.7.0.jar:/Users/ray/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-hadoop/1.7.0/parquet-hadoop-1.7.0.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar:/Users/ray/.m2/repository/org/apache/parquet/parquet-jackson/1.7.0/parquet-jackson-1.7.0.jar:/Users/ray/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/ray/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/ray/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.4.4/jackson-databind-2.4.4.jar:/Users/ray/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.4.4/jackson-annotations-2.4.4.jar:/Users/ray/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.4.4/jackson-core-2.4.4.jar:/Users/ray/.m2/repository/org/apache/avro/avro
/1.7.7/avro-1.7.7.jar:/Users/ray/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/ray/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/ray/.m2/repository/org/objenesis/objenesis/1.0/objenesis-1.0.jar:/Users/ray/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar
-unchecked
  last tree to typer:
Literal(Constant(org.apache.spark.sql.test.ExamplePoint))
              symbol: null
   symbol definition: null
                 tpe: Class(classOf[org.apache.spark.sql.test.ExamplePoint])
       symbol owners:
      context owners: class ExamplePointUDT -> package test
== Enclosing template or block ==
Template( // val <local ExamplePointUDT>: <notype> in class
ExamplePointUDT, tree.tpe=org.apache.spark.sql.test.ExamplePointUDT
  "org.apache.spark.sql.types.UserDefinedType" // parents
  ValDef(
    private
    "_"
    <tpt>
    <empty>
  )
  // 11 statements
  DefDef( // override def sqlType(): org.apache.spark.sql.types.DataType in
class ExamplePointUDT
    <method> override
    "sqlType"
    []
    List(Nil)
    <tpt> // tree.tpe=org.apache.spark.sql.types.DataType
    Apply( // def <init>(elementType:
org.apache.spark.sql.types.DataType,containsNull: Boolean):
org.apache.spark.sql.types.ArrayType in class ArrayType,
tree.tpe=org.apache.spark.sql.types.ArrayType
      new org.apache.spark.sql.types.ArrayType."<init>" // def
<init>(elementType: org.apache.spark.sql.types.DataType,containsNull:
Boolean): org.apache.spark.sql.types.ArrayType in class ArrayType,
tree.tpe=(elementType: org.apache.spark.sql.types.DataType, containsNull:
Boolean)org.apache.spark.sql.types.ArrayType
      // 2 arguments
      "org"."apache"."spark"."sql"."types"."DoubleType" // case object
DoubleType in package types,
tree.tpe=org.apache.spark.sql.types.DoubleType.type
      false
    )
  )
  DefDef( // override def pyUDT(): String in class ExamplePointUDT
    <method> override
    "pyUDT"
    []
    List(Nil)
    <tpt> // tree.tpe=String
    "pyspark.sql.tests.ExamplePointUDT"
  )
  DefDef( // override def serialize(obj: Object):
org.apache.spark.sql.catalyst.util.GenericArrayData in class ExamplePointUDT
    <method> override <triedcooking>
    "serialize"
    []
    // 1 parameter list
    ValDef( // obj: Object
      <param> <triedcooking>
      "obj"
      <tpt> // tree.tpe=Object
      <empty>
    )
    <tpt> // tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
    Block( // tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
      // 3 statements
      ValDef( // case val x1: Object
        case <synthetic> <triedcooking>
        "x1"
        <tpt> // tree.tpe=Object
        "obj" // obj: Object, tree.tpe=Object
      )
      LabelDef( // case def case5():
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
        ()
        If( // tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
          Apply( // final def $isInstanceOf[T0 >: ? <: ?](): Boolean in
class Object, tree.tpe=Boolean
            TypeApply( // final def $isInstanceOf[T0 >: ? <: ?](): Boolean
in class Object, tree.tpe=()Boolean
              "x1"."$isInstanceOf" // final def $isInstanceOf[T0 >: ? <:
?](): Boolean in class Object, tree.tpe=[T0 >: ? <: ?]()Boolean
              <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePoint
            )
            Nil
          )
          Block( //
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
            ValDef( // val x2: org.apache.spark.sql.test.ExamplePoint
              <synthetic> <triedcooking>
              "x2"
              <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePoint
              Typed( // tree.tpe=org.apache.spark.sql.test.ExamplePoint
                Apply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in
class Object, tree.tpe=org.apache.spark.sql.test.ExamplePoint
                  TypeApply( // final def $asInstanceOf[T0 >: ? <: ?](): T0
in class Object, tree.tpe=()org.apache.spark.sql.test.ExamplePoint
                    "x1"."$asInstanceOf" // final def $asInstanceOf[T0 >: ?
<: ?](): T0 in class Object, tree.tpe=[T0 >: ? <: ?]()T0
                    <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePoint
                  )
                  Nil
                )
                <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePoint
              )
            )
            Apply( // case def matchEnd4(x:
org.apache.spark.sql.catalyst.util.GenericArrayData):
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
              "matchEnd4" // case def matchEnd4(x:
org.apache.spark.sql.catalyst.util.GenericArrayData):
org.apache.spark.sql.catalyst.util.GenericArrayData, tree.tpe=(x:
org.apache.spark.sql.catalyst.util.GenericArrayData)org.apache.spark.sql.catalyst.util.GenericArrayData
              Block( //
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
                // 3 statements
                ValDef( // val output: Array[Object]
                  <triedcooking>
                  "output"
                  <tpt> // tree.tpe=Array[Object]
                  Apply( // def <init>(_length: Int): Array[T] in class
Array, tree.tpe=Array[Object]
                    new Array[Object]."<init>" // def <init>(_length: Int):
Array[T] in class Array, tree.tpe=(_length: Int)Array[Object]
                    2
                  )
                )
                Apply( // def update(i: Int,x: T): Unit in class Array,
tree.tpe=Unit
                  "output"."update" // def update(i: Int,x: T): Unit in
class Array, tree.tpe=(i: Int, x: Object)Unit
                  // 2 arguments
                  0
                  Apply( // def box(x: Double): Double in object Double,
tree.tpe=Object
                    "scala"."Double"."box" // def box(x: Double): Double in
object Double, tree.tpe=(x: Double)Double
                    Apply( // val x(): Double in class ExamplePoint,
tree.tpe=Double
                      "x2"."x" // val x(): Double in class ExamplePoint,
tree.tpe=()Double
                      Nil
                    )
                  )
                )
                Apply( // def update(i: Int,x: T): Unit in class Array,
tree.tpe=Unit
                  "output"."update" // def update(i: Int,x: T): Unit in
class Array, tree.tpe=(i: Int, x: Object)Unit
                  // 2 arguments
                  1
                  Apply( // def box(x: Double): Double in object Double,
tree.tpe=Object
                    "scala"."Double"."box" // def box(x: Double): Double in
object Double, tree.tpe=(x: Double)Double
                    Apply( // val y(): Double in class ExamplePoint,
tree.tpe=Double
                      "x2"."y" // val y(): Double in class ExamplePoint,
tree.tpe=()Double
                      Nil
                    )
                  )
                )
                Apply( // def <init>(array: Array[Object]):
org.apache.spark.sql.catalyst.util.GenericArrayData in class
GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
                  new
org.apache.spark.sql.catalyst.util.GenericArrayData."<init>" // def
<init>(array: Array[Object]):
org.apache.spark.sql.catalyst.util.GenericArrayData in class
GenericArrayData, tree.tpe=(array:
Array[Object])org.apache.spark.sql.catalyst.util.GenericArrayData
                  "output" // val output: Array[Object],
tree.tpe=Array[Object]
                )
              )
            )
          )
          Apply( // case def case6():
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
            "case6" // case def case6():
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=()org.apache.spark.sql.catalyst.util.GenericArrayData
            Nil
          )
        )
      )
      LabelDef( // case def case6():
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
        ()
        Apply( // case def matchEnd4(x:
org.apache.spark.sql.catalyst.util.GenericArrayData):
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
          "matchEnd4" // case def matchEnd4(x:
org.apache.spark.sql.catalyst.util.GenericArrayData):
org.apache.spark.sql.catalyst.util.GenericArrayData, tree.tpe=(x:
org.apache.spark.sql.catalyst.util.GenericArrayData)org.apache.spark.sql.catalyst.util.GenericArrayData
          Throw( // tree.tpe=Nothing
            Apply( // def <init>(obj: Object): MatchError in class
MatchError, tree.tpe=MatchError
              new MatchError."<init>" // def <init>(obj: Object):
MatchError in class MatchError, tree.tpe=(obj: Object)MatchError
              "x1" // case val x1: Object, tree.tpe=Object
            )
          )
        )
      )
      LabelDef( // case def matchEnd4(x:
org.apache.spark.sql.catalyst.util.GenericArrayData):
org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
        "x" // x: org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
        "x" // x: org.apache.spark.sql.catalyst.util.GenericArrayData,
tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
      )
    )
  )
  DefDef( // override def deserialize(datum: Object):
org.apache.spark.sql.test.ExamplePoint in class ExamplePointUDT
    <method> override <triedcooking>
    "deserialize"
    []
    // 1 parameter list
    ValDef( // datum: Object
      <param> <triedcooking>
      "datum"
      <tpt> // tree.tpe=Object
      <empty>
    )
    <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePoint
    Block( // tree.tpe=org.apache.spark.sql.test.ExamplePoint
      // 3 statements
      ValDef( // case val x1: Object
        case <synthetic> <triedcooking>
        "x1"
        <tpt> // tree.tpe=Object
        "datum" // datum: Object, tree.tpe=Object
      )
      LabelDef( // case def case5():
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
        ()
        If( // tree.tpe=org.apache.spark.sql.test.ExamplePoint
          Apply( // final def $isInstanceOf[T0 >: ? <: ?](): Boolean in
class Object, tree.tpe=Boolean
            TypeApply( // final def $isInstanceOf[T0 >: ? <: ?](): Boolean
in class Object, tree.tpe=()Boolean
              "x1"."$isInstanceOf" // final def $isInstanceOf[T0 >: ? <:
?](): Boolean in class Object, tree.tpe=[T0 >: ? <: ?]()Boolean
              <tpt> // tree.tpe=org.apache.spark.sql.catalyst.util.ArrayData
            )
            Nil
          )
          Block( // tree.tpe=org.apache.spark.sql.test.ExamplePoint
            ValDef( // val x2: org.apache.spark.sql.catalyst.util.ArrayData
              <synthetic> <triedcooking>
              "x2"
              <tpt> // tree.tpe=org.apache.spark.sql.catalyst.util.ArrayData
              Typed( //
tree.tpe=org.apache.spark.sql.catalyst.util.ArrayData
                Apply( // final def $asInstanceOf[T0 >: ? <: ?](): T0 in
class Object, tree.tpe=org.apache.spark.sql.catalyst.util.ArrayData
                  TypeApply( // final def $asInstanceOf[T0 >: ? <: ?](): T0
in class Object, tree.tpe=()org.apache.spark.sql.catalyst.util.ArrayData
                    "x1"."$asInstanceOf" // final def $asInstanceOf[T0 >: ?
<: ?](): T0 in class Object, tree.tpe=[T0 >: ? <: ?]()T0
                    <tpt> //
tree.tpe=org.apache.spark.sql.catalyst.util.ArrayData
                  )
                  Nil
                )
                <tpt> //
tree.tpe=org.apache.spark.sql.catalyst.util.ArrayData
              )
            )
            Apply( // case def matchEnd4(x:
org.apache.spark.sql.test.ExamplePoint):
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
              "matchEnd4" // case def matchEnd4(x:
org.apache.spark.sql.test.ExamplePoint):
org.apache.spark.sql.test.ExamplePoint, tree.tpe=(x:
org.apache.spark.sql.test.ExamplePoint)org.apache.spark.sql.test.ExamplePoint
              Apply( // def <init>(x: Double,y: Double):
org.apache.spark.sql.test.ExamplePoint in class ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
                new org.apache.spark.sql.test.ExamplePoint."<init>" // def
<init>(x: Double,y: Double): org.apache.spark.sql.test.ExamplePoint in
class ExamplePoint, tree.tpe=(x: Double, y:
Double)org.apache.spark.sql.test.ExamplePoint
                // 2 arguments
                Apply( // def getDouble(x$1: Int): Double in trait
SpecializedGetters, tree.tpe=Double
                  "x2"."getDouble" // def getDouble(x$1: Int): Double in
trait SpecializedGetters, tree.tpe=(x$1: Int)Double
                  0
                )
                Apply( // def getDouble(x$1: Int): Double in trait
SpecializedGetters, tree.tpe=Double
                  "x2"."getDouble" // def getDouble(x$1: Int): Double in
trait SpecializedGetters, tree.tpe=(x$1: Int)Double
                  1
                )
              )
            )
          )
          Apply( // case def case6():
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
            "case6" // case def case6():
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=()org.apache.spark.sql.test.ExamplePoint
            Nil
          )
        )
      )
      LabelDef( // case def case6():
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
        ()
        Apply( // case def matchEnd4(x:
org.apache.spark.sql.test.ExamplePoint):
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
          "matchEnd4" // case def matchEnd4(x:
org.apache.spark.sql.test.ExamplePoint):
org.apache.spark.sql.test.ExamplePoint, tree.tpe=(x:
org.apache.spark.sql.test.ExamplePoint)org.apache.spark.sql.test.ExamplePoint
          Throw( // tree.tpe=Nothing
            Apply( // def <init>(obj: Object): MatchError in class
MatchError, tree.tpe=MatchError
              new MatchError."<init>" // def <init>(obj: Object):
MatchError in class MatchError, tree.tpe=(obj: Object)MatchError
              "x1" // case val x1: Object, tree.tpe=Object
            )
          )
        )
      )
      LabelDef( // case def matchEnd4(x:
org.apache.spark.sql.test.ExamplePoint):
org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
        "x" // x: org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
        "x" // x: org.apache.spark.sql.test.ExamplePoint,
tree.tpe=org.apache.spark.sql.test.ExamplePoint
      )
    )
  )
  DefDef( // override def userClass(): Class in class ExamplePointUDT
    <method> override
    "userClass"
    []
    List(Nil)
    <tpt> // tree.tpe=Class
    classOf[org.apache.spark.sql.test.ExamplePoint]
  )
  DefDef( // override private[package spark] def asNullable(): org.apache.spark.sql.test.ExamplePointUDT in class ExamplePointUDT
    <method> override <triedcooking>
    "asNullable"
    []
    List(Nil)
    <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePointUDT
    This(<empty>)private[package sql] class ExamplePointUDT extends UserDefinedType in package test, tree.tpe=org.apache.spark.sql.test.ExamplePointUDT
  )
  DefDef( // override def asNullable(): org.apache.spark.sql.types.DataType in class ExamplePointUDT
    <method> override <bridge>
    "asNullable"
    []
    List(Nil)
    <tpt> // tree.tpe=org.apache.spark.sql.types.DataType
    Apply( // override private[package spark] def asNullable(): org.apache.spark.sql.test.ExamplePointUDT in class ExamplePointUDT, tree.tpe=org.apache.spark.sql.test.ExamplePointUDT
      ExamplePointUDT.this."asNullable" // override private[package spark] def asNullable(): org.apache.spark.sql.test.ExamplePointUDT in class ExamplePointUDT, tree.tpe=()org.apache.spark.sql.test.ExamplePointUDT
      Nil
    )
  )
  DefDef( // override def asNullable(): org.apache.spark.sql.types.UserDefinedType in class ExamplePointUDT
    <method> override <bridge>
    "asNullable"
    []
    List(Nil)
    <tpt> // tree.tpe=org.apache.spark.sql.types.UserDefinedType
    Apply( // override private[package spark] def asNullable(): org.apache.spark.sql.test.ExamplePointUDT in class ExamplePointUDT, tree.tpe=org.apache.spark.sql.test.ExamplePointUDT
      ExamplePointUDT.this."asNullable" // override private[package spark] def asNullable(): org.apache.spark.sql.test.ExamplePointUDT in class ExamplePointUDT, tree.tpe=()org.apache.spark.sql.test.ExamplePointUDT
      Nil
    )
  )
  DefDef( // override def deserialize(datum: Object): Object in class ExamplePointUDT
    <method> override <bridge>
    "deserialize"
    []
    // 1 parameter list
    ValDef( // datum: Object
      <param> <triedcooking>
      "datum"
      <tpt> // tree.tpe=Object
      <empty>
    )
    <tpt> // tree.tpe=Object
    Apply( // override def deserialize(datum: Object): org.apache.spark.sql.test.ExamplePoint in class ExamplePointUDT, tree.tpe=org.apache.spark.sql.test.ExamplePoint
      ExamplePointUDT.this."deserialize" // override def deserialize(datum: Object): org.apache.spark.sql.test.ExamplePoint in class ExamplePointUDT, tree.tpe=(datum: Object)org.apache.spark.sql.test.ExamplePoint
      "datum" // datum: Object, tree.tpe=Object
    )
  )
  DefDef( // override def serialize(obj: Object): Object in class ExamplePointUDT
    <method> override <bridge>
    "serialize"
    []
    // 1 parameter list
    ValDef( // obj: Object
      <param> <triedcooking>
      "obj"
      <tpt> // tree.tpe=Object
      <empty>
    )
    <tpt> // tree.tpe=Object
    Apply( // override def serialize(obj: Object): org.apache.spark.sql.catalyst.util.GenericArrayData in class ExamplePointUDT, tree.tpe=org.apache.spark.sql.catalyst.util.GenericArrayData
      ExamplePointUDT.this."serialize" // override def serialize(obj: Object): org.apache.spark.sql.catalyst.util.GenericArrayData in class ExamplePointUDT, tree.tpe=(obj: Object)org.apache.spark.sql.catalyst.util.GenericArrayData
      "obj" // obj: Object, tree.tpe=Object
    )
  )
  DefDef( // def <init>(): org.apache.spark.sql.test.ExamplePointUDT in class ExamplePointUDT
    <method>
    "<init>"
    []
    List(Nil)
    <tpt> // tree.tpe=org.apache.spark.sql.test.ExamplePointUDT
    Block( // tree.tpe=Unit
      Apply( // def <init>(): org.apache.spark.sql.types.UserDefinedType in class UserDefinedType, tree.tpe=org.apache.spark.sql.types.UserDefinedType
        ExamplePointUDT.super."<init>" // def <init>(): org.apache.spark.sql.types.UserDefinedType in class UserDefinedType, tree.tpe=()org.apache.spark.sql.types.UserDefinedType
        Nil
      )
      ()
    )
  )
)
== Expanded type of tree ==
*ConstantType(*
*  value = Constant(org.apache.spark.sql.test.ExamplePoint)*
*)*
*uncaught exception during compilation: java.lang.AssertionError*

*Error:scala: Error: assertion failed: List(object package$DebugNode, object package$DebugNode)*
*java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)*
at scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:1678)
at scala.reflect.internal.Symbols$ClassSymbol.companionModule0(Symbols.scala:2988)
at scala.reflect.internal.Symbols$ClassSymbol.companionModule(Symbols.scala:2991)
at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1371)
at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:120)
at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1583)
at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1557)
at scala.tools.nsc.Global$Run.compileSources(Global.scala:1553)
at scala.tools.nsc.Global$Run.compile(Global.scala:1662)
at xsbt.CachedCompiler0.run(CompilerInterface.scala:126)
at xsbt.CachedCompiler0.run(CompilerInterface.scala:102)
at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1$$anonfun$apply$3$$anonfun$apply$1.apply$mcV$sp(AggressiveCompile.scala:106)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1$$anonfun$apply$3$$anonfun$apply$1.apply(AggressiveCompile.scala:106)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1$$anonfun$apply$3$$anonfun$apply$1.apply(AggressiveCompile.scala:106)
at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:179)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1$$anonfun$apply$3.apply(AggressiveCompile.scala:105)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1$$anonfun$apply$3.apply(AggressiveCompile.scala:102)
at scala.Option.foreach(Option.scala:245)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:102)
at sbt.compiler.AggressiveCompile$$anonfun$6$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:102)
at scala.Option.foreach(Option.scala:245)
at sbt.compiler.AggressiveCompile$$anonfun$6.compileScala$1(AggressiveCompile.scala:102)
at sbt.compiler.AggressiveCompile$$anonfun$6.apply(AggressiveCompile.scala:151)
at sbt.compiler.AggressiveCompile$$anonfun$6.apply(AggressiveCompile.scala:89)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:40)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:38)
at sbt.inc.IncrementalCommon.cycle(Incremental.scala:103)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:39)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:69)
at sbt.inc.Incremental$.compile(Incremental.scala:38)
at sbt.inc.IncrementalCompile$.apply(Compile.scala:28)
at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:170)
at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:73)
at org.jetbrains.jps.incremental.scala.local.SbtCompiler.compile(SbtCompiler.scala:66)
at org.jetbrains.jps.incremental.scala.local.LocalServer.compile(LocalServer.scala:26)
at org.jetbrains.jps.incremental.scala.remote.Main$.make(Main.scala:62)
at org.jetbrains.jps.incremental.scala.remote.Main$.nailMain(Main.scala:20)
at org.jetbrains.jps.incremental.scala.remote.Main.nailMain(Main.scala)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319)

I have highlighted the error messages I consider important in *bold and red*.

This has been bothering me for several days, and I don't know how to get past it. Any suggestions? Thanks.
