spark-issues mailing list archives

From "inred (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-18263) Configuring spark.kryo.registrator programmatically doesn't take effect
Date Mon, 07 Nov 2016 05:52:58 GMT

    [ https://issues.apache.org/jira/browse/SPARK-18263?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15643210#comment-15643210 ]

inred commented on SPARK-18263:
-------------------------------

"C:\Program Files\Java\jdk1.8.0_92\bin\java" -Didea.launcher.port=7532 "-Didea.launcher.bin.path=C:\Program
Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.2.5\bin" -Dfile.encoding=UTF-8
-classpath "C:\Program Files\Java\jdk1.8.0_92\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\deploy.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\cldrdata.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\jaccess.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\localedata.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunec.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunmscapi.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\zipfs.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\jce.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\jfxswt.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\management-agent.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_92\jre\lib\resources.jar;C:\Program
Files\Java\jdk1.8.0_92\jre\lib\rt.jar;E:\app\wc\target\scala-2.11\classes;C:\Users\feiwu\.ivy2\cache\org.scala-lang\scala-library\jars\scala-library-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xz-1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jta-1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jpam-1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\guice-3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\ivy-2.4.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\lz4-1.3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\oro-2.0.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\ST4-4.0.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\avro-1.7.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\core-1.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\gson-2.2.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\mail-1.4.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\mx4j-3.0.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\snappy-0.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\antlr-2.7.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\opencsv-2.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\py4j-0.10.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xmlenc-0.52.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\base64-2.3.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\guava-14.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\janino-2.7.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jets3t-0.9.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jetty-6.1.26.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jline-2.12.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jsr305-1.3.9.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\log4j-1.2.17.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\minlog-1.3.0.
jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\pyrolite-4.9.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stream-2.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jdo-api-3.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json-20090211.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\objenesis-2.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\paranamer-2.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scalap-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\avro-ipc-1.7.7.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-io-2.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\httpcore-4.4.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\JavaEWAH-0.3.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.inject-1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jaxb-api-2.2.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\libfb303-0.9.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stax-api-1.0-2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stax-api-1.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\aopalliance-1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-cli-1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-net-2.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\derby-10.12.1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\joda-time-2.9.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jodd-core-3.5.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\libthrift-0.9.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\super-csv-2.2.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\zookeeper-3.4.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\activation-1.1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\chill-java-0.8.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\
jars\chill_2.11-0.8.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-dbcp-1.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-lang-2.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\httpclient-4.5.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javolution-5.5.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\slf4j-api-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spire_2.11-0.7.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xercesImpl-2.9.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\antlr-runtime-3.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\guice-servlet-3.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-auth-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-hdfs-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hk2-api-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\htrace-core-3.0.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-xc-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jetty-util-6.1.26.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jtransforms-2.4.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\kryo-shaded-3.0.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-jvm-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\netty-3.8.0.Final.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\pmml-model-1.2.15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\api-util-1.0.0-M20.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\breeze_2.11-0.11.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-codec-1.10.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-pool-1.5.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\compress-lzf-1.0.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-core-2.6.5.jar;D:\Documents\download\spa
rk-2.0.1-bin-hadoop2.6\jars\leveldbjni-all-1.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-core-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-json-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\pmml-schema-1.2.15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\bcprov-jdk15on-1.51.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-lang3-3.3.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-math3-3.4.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-client-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hk2-utils-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\java-xmlbuilder-1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javassist-3.18.1-GA.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-guava-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jul-to-slf4j-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\protobuf-java-2.5.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\snappy-java-1.1.2.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\antlr4-runtime-4.5.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\bonecp-0.8.0.RELEASE.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-digester-1.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\curator-client-2.6.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-jaxrs-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-client-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-common-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-server-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-column-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-common-1.7.0.jar;D:\Documents\down
load\spark-2.0.1-bin-hadoop2.6\jars\parquet-hadoop-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\RoaringBitmap-0.5.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-library-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-reflect-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-xml_2.11-1.0.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\slf4j-log4j12-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-sql_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\stringtemplate-3.2.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-logging-1.1.3.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\curator-recipes-2.6.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-api-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-cli-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hk2-locator-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.ws.rs-api-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jcl-over-slf4j-1.7.16.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-jackson-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-compiler-2.11.8.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-core_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-hive_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-repl_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-tags_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-yarn_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\xbean-asm5-shaded-4.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\api-asn1-api-1.0.0-M20.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-compiler-2.7.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\
jars\commons-compress-1.4.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-httpclient-3.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-exec-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-jdbc-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-databind-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.inject-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json4s-ast_2.11-3.2.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\metrics-graphite-3.1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\netty-all-4.0.29.Final.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-encoding-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-mllib_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\apacheds-i18n-2.0.0-M15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\arpack_combined_all-0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-beanutils-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\curator-framework-2.6.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\datanucleus-core-3.2.10.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\datanucleus-rdbms-3.2.9.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-core-asl-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.servlet-api-3.1.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json4s-core_2.11-3.2.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-generator-1.7.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-graphx_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-sketch_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-unsafe_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spire-macros_2.11-0.7.4.jar;D:\Documents\download\spark-2.0.1-
bin-hadoop2.6\jars\univocity-parsers-2.1.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-annotations-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-client-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\javax.annotation-api-1.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-media-jaxb-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\avro-mapred-1.7.7-hadoop2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\breeze-macros_2.11-0.11.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-collections-3.2.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\commons-configuration-1.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\datanucleus-api-jdo-3.2.6.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-beeline-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-annotations-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-mapper-asl-1.9.13.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-catalyst_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-launcher_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\apache-log4j-extras-1.2.17.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\eigenbase-properties-1.1.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\json4s-jackson_2.11-3.2.11.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-streaming_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\validation-api-1.1.0.Final.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hive-metastore-1.2.1.spark2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\osgi-resource-locator-1.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-hadoop-bundle-1.6.0.jar;D:\Documents\download\spark-2.0.1-bin-
hadoop2.6\jars\commons-beanutils-core-1.8.0.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\mesos-0.21.1-shaded-protobuf.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-mllib-local_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\calcite-core-1.2.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-module-paranamer-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\calcite-linq4j-1.2.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-server-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jackson-module-scala_2.11-2.6.5.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-container-servlet-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\parquet-format-2.3.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-network-common_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\aopalliance-repackaged-2.4.0-b34.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\calcite-avatica-1.2.0-incubating.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-network-shuffle_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\apacheds-kerberos-codec-2.0.0-M15.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-app-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-core-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-yarn-server-web-proxy-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\spark-hive-thriftserver_2.11-2.0.1.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\scala-parser-combinators_2.11-1.0.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-common-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\jersey-container-servlet-core-2.22.2.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-ma
preduce-client-shuffle-2.6.4.jar;D:\Documents\download\spark-2.0.1-bin-hadoop2.6\jars\hadoop-mapreduce-client-jobclient-2.6.4.jar;D:\Documents\download\adam-distribution-spark2_2.11-0.20.0\repo\adam_2.11-0.20.0.jar;C:\Program
Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.2.5\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain
org.apache.spark.examples.SparkPi --conf spark.kryo.registrator=org.bdgenomics.adam.serialization.ADAMKryoRegistrator
--conf spark.serializer=org.apache.spark.serializer.KryoSerializer --conf spark.master=local[*]
--class org.apache.spark.examples.SparkPi
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/Documents/download/spark-2.0.1-bin-hadoop2.6/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/Documents/download/adam-distribution-spark2_2.11-0.20.0/repo/adam_2.11-0.20.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2016-11-07 13:51:11 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-11-07 13:51:13 WARN  SparkContext:66 - Use an existing SparkContext, some configuration may not take effect.

> Configuring spark.kryo.registrator programmatically doesn't take effect
> -----------------------------------------------------------------------
>
>                 Key: SPARK-18263
>                 URL: https://issues.apache.org/jira/browse/SPARK-18263
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.1
>         Environment: spark-2.0.1-bin-hadoop2.6
> scala-2.11.8
>            Reporter: inred
>
> It runs OK with spark-shell:
>     spark-shell --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
>         --conf spark.kryo.registrator=org.bdgenomics.adam.serialization.ADAMKryoRegistrator
> but in the IDE, with
>     val spark = SparkSession.builder.master("local[*]").appName("Anno BDG").getOrCreate()
>     spark.conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>     spark.conf.set("spark.kryo.registrator", "org.bdgenomics.adam.serialization.ADAMKryoRegistrator")
> it reports the following error:
> java.io.NotSerializableException: org.bdgenomics.formats.avro.AlignmentRecord
> Serialization stack:
> object not serializable (class: org.bdgenomics.formats.avro.AlignmentRecord, value: {"readInFragment":
0, "contigName": "chr10", "start": 61758687, "oldPosition": null, "end": 61758727, "mapq":
25, "readName": "NB501244AR:119:HJY3WBGXY:2:11112:6137:19359", "sequence": "AAAATACTGAGACTTATCAGAATTTCAGGCTAAAGCAACC",
"qual": "AAAAAAEEEEEAEEEEEEEEEEEEEEEEEEEEEEEEEEEE", "cigar": "40M", "oldCigar": null, "basesTrimmedFromStart":
0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true,
"mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand":
false, "mateNegativeStrand": false, "primaryAlignment": true, "secondaryAlignment": false,
"supplementaryAlignment": false, "mismatchingPositions": "40", "origQual": null, "attributes":
"XT:A:U\tXO:i:0\tXM:i:0\tNM:i:0\tXG:i:0\tX1:i:0\tX0:i:1", "recordGroupName": null, "recordGroupSample":
null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContigName": null, "inferredInsertSize":
null})
> at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
> at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
> at org.apache.spark.serializer.SerializationStream.writeValue(Serializer.scala:135)
> at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:185)
> at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 2016-11-04 10:30:56 ERROR TaskSetManager:70 - Task 0.0 in stage 2.0 (TID 9) had a not serializable result: org.bdgenomics.formats.avro.AlignmentRecord
> Serialization stack:
> object not serializable (class: org.bdgenomics.formats.avro.AlignmentRecord, value: {"readInFragment":
0, "contigName": "chr1", "start": 10001, "oldPosition": null, "end": 10041, "mapq": 0, "readName":
"NB501244AR:119:HJY3WBGXY:3:11508:7857:8792", "sequence": "AACCCTAACCCTAACCCTAACCCTAACCCTAACCCTAACC",
"qual": "///E////6E////EEAEEE/EEEEEEEEEEEEAEAAA/A", "cigar": "40M", "oldCigar": null, "basesTrimmedFromStart":
0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true,
"mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand":
true, "mateNegativeStrand": false, "primaryAlignment": true, "secondaryAlignment": false,
"supplementaryAlignment": false, "mismatchingPositions": "40", "origQual": null, "attributes":
"XT:A:R\tXO:i:0\tXM:i:0\tNM:i:0\tXG:i:0\tX0:i:594", "recordGroupName": null, "recordGroupSample":
null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContigName": null, "inferredInsertSize":
null}); not retrying
> 2016-11-04 10:30:56 ERROR TaskSetManager:70 - Task 4.0 in stage 2.0 (TID 13) had a not serializable result: org.bdgenomics.formats.avro.AlignmentRecord
> Serialization stack:
> object not serializable (class: org.bdgenomics.formats.avro.AlignmentRecord, value: {"readInFragment":
0, "contigName": "chr10", "start": 61758687, "oldPosition": null, "end": 61758727, "mapq":
25, "readName": "NB501244AR:119:HJY3WBGXY:2:11112:6137:19359", "sequence": "AAAATACTGAGACTTATCAGAATTTCAGGCTAAAGCAACC",
"qual": "AAAAAAEEEEEAEEEEEEEEEEEEEEEEEEEEEEEEEEEE", "cigar": "40M", "oldCigar": null, "basesTrimmedFromStart":
0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true,
"mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand":
false, "mateNegativeStrand": false, "primaryAlignment": true, "secondaryAlignment": false,
"supplementaryAlignment": false, "mismatchingPositions": "40", "origQual": null, "attributes":
"XT:A:U\tXO:i:0\tXM:i:0\tNM:i:0\tXG:i:0\tX1:i:0\tX0:i:1", "recordGroupName": null, "recordGroupSample":
null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContigName": null, "inferredInsertSize":
null}); not retrying
> 2016-11-04 10:30:56 ERROR TaskSetManager:70 - Task 3.0 in stage 2.0 (TID 12) had a not serializable result: org.bdgenomics.formats.avro.AlignmentRecord
> Serialization stack:
> object not serializable (class: org.bdgenomics.formats.avro.AlignmentRecord, value: {"readInFragment":
0, "contigName": "chr7", "start": 68163823, "oldPosition": null, "end": 68163863, "mapq":
0, "readName": "NB501244AR:119:HJY3WBGXY:4:21602:16293:18064", "sequence": "TGTGAGGGTGTTGCCCAAAAGAGATTAACATTTGAGTCAG",
"qual": "AAAAAEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE", "cigar": "40M", "oldCigar": null, "basesTrimmedFromStart":
0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true,
"mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand":
false, "mateNegativeStrand": false, "primaryAlignment": true, "secondaryAlignment": false,
"supplementaryAlignment": false, "mismatchingPositions": "40", "origQual": null, "attributes":
"XT:A:R\tXO:i:0\tXM:i:0\tNM:i:0\tXG:i:0\tXA:Z:chr3,-84617448,40M,0;\tX1:i:0\tX0:i:2", "recordGroupName":
null, "recordGroupSample": null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContigName":
null, "inferredInsertSize": null}); not retrying
> 2016-11-04 10:30:56 ERROR TaskSetManager:70 - Task 2.0 in stage 2.0 (TID 11) had a not serializable result: org.bdgenomics.formats.avro.AlignmentRecord
> Serialization stack:
> object not serializable (class: org.bdgenomics.formats.avro.AlignmentRecord, value: {"readInFragment":
0, "contigName": "chr4", "start": 181076278, "oldPosition": null, "end": 181076318, "mapq":
25, "readName": "NB501244AR:119:HJY3WBGXY:2:23302:26459:8305", "sequence": "CACTGTGTTTTACTTCTATTTTAAAAAACCTGAAGGCTAT",
"qual": "EEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEAAAAA", "cigar": "40M", "oldCigar": null, "basesTrimmedFromStart":
0, "basesTrimmedFromEnd": 0, "readPaired": false, "properPair": false, "readMapped": true,
"mateMapped": false, "failedVendorQualityChecks": false, "duplicateRead": false, "readNegativeStrand":
true, "mateNegativeStrand": false, "primaryAlignment": true, "secondaryAlignment": false,
"supplementaryAlignment": false, "mismatchingPositions": "40", "origQual": null, "attributes":
"XT:A:U\tXO:i:0\tXM:i:0\tNM:i:0\tXG:i:0\tX1:i:0\tX0:i:1", "recordGroupName": null, "recordGroupSample":
null, "mateAlignmentStart": null, "mateAlignmentEnd": null, "mateContigName": null, "inferredInsertSize":
null}); not retrying
> Process finished with exit code 1
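Editor's note for readers hitting the same symptom: the WARN in the log above ("Use an existing SparkContext, some configuration may not take effect") points at the likely cause. Serializer-related properties such as spark.serializer and spark.kryo.registrator are read once, when the SparkContext is created, so calling spark.conf.set(...) on an already-created session cannot change them. A minimal sketch of the pattern that avoids this, reusing the class names from the report, is to pass the settings to the builder before getOrCreate():

```scala
import org.apache.spark.sql.SparkSession

// Serializer settings must be in place before the SparkContext exists;
// builder.config(...) applies them at creation time, unlike spark.conf.set(...).
val spark = SparkSession.builder
  .master("local[*]")
  .appName("Anno BDG")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryo.registrator", "org.bdgenomics.adam.serialization.ADAMKryoRegistrator")
  .getOrCreate()
```

This is configuration rather than a standalone program; it needs the Spark and ADAM jars on the classpath to run, exactly as in the launch command quoted earlier in this message.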



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


