spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-25001) Fix build miscellaneous warnings
Date Thu, 02 Aug 2018 22:01:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-25001?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-25001:
------------------------------------

    Assignee:     (was: Apache Spark)

> Fix build miscellaneous warnings
> --------------------------------
>
>                 Key: SPARK-25001
>                 URL: https://issues.apache.org/jira/browse/SPARK-25001
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.4.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>
> There are many warnings in the current build (for instance see https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7/4734/console).
> {code}
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/kvstore/src/main/java/org/apache/spark/util/kvstore/LevelDB.java:237: warning: [rawtypes] found raw type: LevelDBIterator
> [warn]   void closeIterator(LevelDBIterator it) throws IOException {
> [warn]                      ^
> [warn]   missing type arguments for generic class LevelDBIterator<T>
> [warn]   where T is a type-variable:
> [warn]     T extends Object declared in class LevelDBIterator
> [warn] 1 warning
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportServer.java:151: warning: [deprecation] group() in AbstractBootstrap has been deprecated
> [warn]     if (bootstrap != null && bootstrap.group() != null) {
> [warn]                                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportServer.java:152: warning: [deprecation] group() in AbstractBootstrap has been deprecated
> [warn]       bootstrap.group().shutdownGracefully();
> [warn]                ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportServer.java:154: warning: [deprecation] childGroup() in ServerBootstrap has been deprecated
> [warn]     if (bootstrap != null && bootstrap.childGroup() != null) {
> [warn]                                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportServer.java:155: warning: [deprecation] childGroup() in ServerBootstrap has been deprecated
> [warn]       bootstrap.childGroup().shutdownGracefully();
> [warn]                ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/util/NettyUtils.java:112: warning: [deprecation] PooledByteBufAllocator(boolean,int,int,int,int,int,int,int) in PooledByteBufAllocator has been deprecated
> [warn]     return new PooledByteBufAllocator(
> [warn]            ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/client/TransportClient.java:321: warning: [rawtypes] found raw type: Future
> [warn]     public void operationComplete(Future future) throws Exception {
> [warn]                                   ^
> [warn]   missing type arguments for generic class Future<V>
> [warn]   where V is a type-variable:
> [warn]     V extends Object declared in interface Future
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/client/TransportResponseHandler.java:215: warning: [rawtypes] found raw type: StreamInterceptor
> [warn]           StreamInterceptor interceptor = new StreamInterceptor(this, resp.streamId, resp.byteCount,
> [warn]           ^
> [warn]   missing type arguments for generic class StreamInterceptor<T>
> [warn]   where T is a type-variable:
> [warn]     T extends Message declared in class StreamInterceptor
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/client/TransportResponseHandler.java:215: warning: [rawtypes] found raw type: StreamInterceptor
> [warn]           StreamInterceptor interceptor = new StreamInterceptor(this, resp.streamId, resp.byteCount,
> [warn]                                               ^
> [warn]   missing type arguments for generic class StreamInterceptor<T>
> [warn]   where T is a type-variable:
> [warn]     T extends Message declared in class StreamInterceptor
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/client/TransportResponseHandler.java:215: warning: [unchecked] unchecked call to StreamInterceptor(MessageHandler<T>,String,long,StreamCallback) as a member of the raw type StreamInterceptor
> [warn]           StreamInterceptor interceptor = new StreamInterceptor(this, resp.streamId, resp.byteCount,
> [warn]                                           ^
> [warn]   where T is a type-variable:
> [warn]     T extends Message declared in class StreamInterceptor
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java:255: warning: [rawtypes] found raw type: StreamInterceptor
> [warn]         StreamInterceptor interceptor = new StreamInterceptor(this, wrappedCallback.getID(),
> [warn]         ^
> [warn]   missing type arguments for generic class StreamInterceptor<T>
> [warn]   where T is a type-variable:
> [warn]     T extends Message declared in class StreamInterceptor
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java:255: warning: [rawtypes] found raw type: StreamInterceptor
> [warn]         StreamInterceptor interceptor = new StreamInterceptor(this, wrappedCallback.getID(),
> [warn]                                             ^
> [warn]   missing type arguments for generic class StreamInterceptor<T>
> [warn]   where T is a type-variable:
> [warn]     T extends Message declared in class StreamInterceptor
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/server/TransportRequestHandler.java:255: warning: [unchecked] unchecked call to StreamInterceptor(MessageHandler<T>,String,long,StreamCallback) as a member of the raw type StreamInterceptor
> [warn]         StreamInterceptor interceptor = new StreamInterceptor(this, wrappedCallback.getID(),
> [warn]                                         ^
> [warn]   where T is a type-variable:
> [warn]     T extends Message declared in class StreamInterceptor
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/crypto/TransportCipher.java:270: warning: [deprecation] transfered() in FileRegion has been deprecated
> [warn]         region.transferTo(byteRawChannel, region.transfered());
> [warn]                                                 ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/main/java/org/apache/spark/network/sasl/SaslEncryption.java:304: warning: [deprecation] transfered() in FileRegion has been deprecated
> [warn]         region.transferTo(byteChannel, region.transfered());
> [warn]                                              ^
> [warn] 14 warnings
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/test/java/org/apache/spark/network/ProtocolSuite.java:119: warning: [deprecation] transfered() in FileRegion has been deprecated
> [warn]       while (in.transfered() < in.count()) {
> [warn]                ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/network-common/src/test/java/org/apache/spark/network/ProtocolSuite.java:120: warning: [deprecation] transfered() in FileRegion has been deprecated
> [warn]         in.transferTo(channel, in.transfered());
> [warn]                                  ^
> [warn] 2 warnings
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/unsafe/src/test/java/org/apache/spark/unsafe/hash/Murmur3_x86_32Suite.java:80: warning: [static] static method should be qualified by type name, Murmur3_x86_32, instead of by an expression
> [warn]     Assert.assertEquals(-300363099, hasher.hashUnsafeWords(bytes, offset, 16, 42));
> [warn]                                           ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/unsafe/src/test/java/org/apache/spark/unsafe/hash/Murmur3_x86_32Suite.java:84: warning: [static] static method should be qualified by type name, Murmur3_x86_32, instead of by an expression
> [warn]     Assert.assertEquals(-1210324667, hasher.hashUnsafeWords(bytes, offset, 16, 42));
> [warn]                                            ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/common/unsafe/src/test/java/org/apache/spark/unsafe/hash/Murmur3_x86_32Suite.java:88: warning: [static] static method should be qualified by type name, Murmur3_x86_32, instead of by an expression
> [warn]     Assert.assertEquals(-634919701, hasher.hashUnsafeWords(bytes, offset, 16, 42));
> [warn]                                           ^
> [warn] 3 warnings
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/launcher/src/main/java/org/apache/spark/launcher/AbstractLauncher.java:31: warning: [rawtypes] found raw type: AbstractLauncher
> [warn] public abstract class AbstractLauncher<T extends AbstractLauncher> {
> [warn]                                                  ^
> [warn]   missing type arguments for generic class AbstractLauncher<T>
> [warn]   where T is a type-variable:
> [warn]     T extends AbstractLauncher declared in class AbstractLauncher
> [warn] 1 warning
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:99: method group in class AbstractBootstrap is deprecated: see corresponding Javadoc for more information.
> [warn]     if (bootstrap != null && bootstrap.group() != null) {
> [warn]                                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:100: method group in class AbstractBootstrap is deprecated: see corresponding Javadoc for more information.
> [warn]       bootstrap.group().shutdownGracefully()
> [warn]                 ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:102: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
> [warn]     if (bootstrap != null && bootstrap.childGroup() != null) {
> [warn]                                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/main/scala/org/apache/spark/api/r/RBackend.scala:103: method childGroup in class ServerBootstrap is deprecated: see corresponding Javadoc for more information.
> [warn]       bootstrap.childGroup().shutdownGracefully()
> [warn]                 ^
> [warn] 6 warnings found
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/util/ClosureCleanerSuite.scala:151: reflective access of structural type member method getData should be enabled
> [warn] by making the implicit value scala.language.reflectiveCalls visible.
> [warn] This can be achieved by adding the import clause 'import scala.language.reflectiveCalls'
> [warn] or by setting the compiler option -language:reflectiveCalls.
> [warn] See the Scaladoc for value scala.language.reflectiveCalls for a discussion
> [warn] why the feature should be explicitly enabled.
> [warn]       val rdd = sc.parallelize(1 to 1).map(concreteObject.getData)
> [warn]                                                           ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/util/ClosureCleanerSuite.scala:175: reflective access of structural type member value innerObject2 should be enabled
> [warn] by making the implicit value scala.language.reflectiveCalls visible.
> [warn]       val rdd = sc.parallelize(1 to 1).map(concreteObject.innerObject2.getData)
> [warn]                                                           ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/util/ClosureCleanerSuite.scala:175: reflective access of structural type member method getData should be enabled
> [warn] by making the implicit value scala.language.reflectiveCalls visible.
> [warn]       val rdd = sc.parallelize(1 to 1).map(concreteObject.innerObject2.getData)
> [warn]                                                                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/LocalSparkContext.scala:32: constructor Slf4JLoggerFactory in class Slf4JLoggerFactory is deprecated: see corresponding Javadoc for more information.
> [warn]     InternalLoggerFactory.setDefaultFactory(new Slf4JLoggerFactory())
> [warn]                                             ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:218: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]         assert(wrapper.stageAttemptId === stages.head.attemptId)
> [warn]                                                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:261: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]       stageAttemptId = stages.head.attemptId))
> [warn]                                    ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:287: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]       stageAttemptId = stages.head.attemptId))
> [warn]                                    ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:471: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]       stageAttemptId = stages.last.attemptId))
> [warn]                                    ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:966: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]     listener.onTaskStart(SparkListenerTaskStart(dropped.stageId, dropped.attemptId, task))
> [warn]                                                                          ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:972: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]     listener.onTaskEnd(SparkListenerTaskEnd(dropped.stageId, dropped.attemptId,
> [warn]                                                                      ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:976: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]       .taskSummary(dropped.stageId, dropped.attemptId, Array(0.25d, 0.50d, 0.75d))
> [warn]                                             ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:1146: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]       SparkListenerTaskEnd(stage1.stageId, stage1.attemptId, "taskType", Success, tasks(1), null))
> [warn]                                                   ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/status/AppStatusListenerSuite.scala:1150: value attemptId in class StageInfo is deprecated: Use attemptNumber instead
> [warn]       SparkListenerTaskEnd(stage1.stageId, stage1.attemptId, "taskType", Success, tasks(0), null))
> [warn]                                                   ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/storage/DiskStoreSuite.scala:197: method transfered in trait FileRegion is deprecated: see corresponding Javadoc for more information.
> [warn]     while (region.transfered() < region.count()) {
> [warn]                   ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/core/src/test/scala/org/apache/spark/storage/DiskStoreSuite.scala:198: method transfered in trait FileRegion is deprecated: see corresponding Javadoc for more information.
> [warn]       region.transferTo(byteChannel, region.transfered())
> [warn]                                             ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala:534: abstract type T is unchecked since it is eliminated by erasure
> [warn]       assert(partitioning.isInstanceOf[T])
> [warn]                                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala:534: abstract type T is unchecked since it is eliminated by erasure
> [warn]       assert(partitioning.isInstanceOf[T])
> [warn]             ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ObjectExpressionsSuite.scala:323: inferred existential type Option[Class[_$1]]( forSome { type _$1 }), which cannot be expressed by wildcards, should be enabled
> [warn] by making the implicit value scala.language.existentials visible.
> [warn] This can be achieved by adding the import clause 'import scala.language.existentials'
> [warn] or by setting the compiler option -language:existentials.
> [warn] See the Scaladoc for value scala.language.existentials for a discussion
> [warn] why the feature should be explicitly enabled.
> [warn]       val optClass = Option(collectionCls)
> [warn]                            ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:368: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
> [warn]         ParquetFileReader.readFooter(sharedConf, filePath, SKIP_ROW_GROUPS).getFileMetaData
> [warn]                           ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:545: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
> [warn]             ParquetFileReader.readFooter(
> [warn]                               ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java:105: warning: [deprecation] readFooter(Configuration,Path,MetadataFilter) in ParquetFileReader has been deprecated
> [warn]       footer = readFooter(configuration, file, range(split.getStart(), split.getEnd()));
> [warn]                ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java:108: warning: [deprecation] filterRowGroups(Filter,List<BlockMetaData>,MessageType) in RowGroupFilter has been deprecated
> [warn]       blocks = filterRowGroups(filter, footer.getBlocks(), fileSchema);
> [warn]                ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java:111: warning: [deprecation] readFooter(Configuration,Path,MetadataFilter) in ParquetFileReader has been deprecated
> [warn]       footer = readFooter(configuration, file, NO_FILTER);
> [warn]                ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java:147: warning: [deprecation] ParquetFileReader(Configuration,FileMetaData,Path,List<BlockMetaData>,List<ColumnDescriptor>) in ParquetFileReader has been deprecated
> [warn]     this.reader = new ParquetFileReader(
> [warn]                   ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java:203: warning: [deprecation] readFooter(Configuration,Path,MetadataFilter) in ParquetFileReader has been deprecated
> [warn]     ParquetMetadata footer = readFooter(config, file, range(0, length));
> [warn]                              ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java:226: warning: [deprecation] ParquetFileReader(Configuration,FileMetaData,Path,List<BlockMetaData>,List<ColumnDescriptor>) in ParquetFileReader has been deprecated
> [warn]     this.reader = new ParquetFileReader(
> [warn]                   ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:178: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]             (descriptor.getType() == PrimitiveType.PrimitiveTypeName.INT32 ||
> [warn]                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:179: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]             (descriptor.getType() == PrimitiveType.PrimitiveTypeName.INT64  &&
> [warn]                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:181: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]             descriptor.getType() == PrimitiveType.PrimitiveTypeName.FLOAT ||
> [warn]                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:182: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]             descriptor.getType() == PrimitiveType.PrimitiveTypeName.DOUBLE ||
> [warn]                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:183: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]             descriptor.getType() == PrimitiveType.PrimitiveTypeName.BINARY))) {
> [warn]                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:198: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]         switch (descriptor.getType()) {
> [warn]                           ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:221: warning: [deprecation] getTypeLength() in ColumnDescriptor has been deprecated
> [warn]             readFixedLenByteArrayBatch(rowId, num, column, descriptor.getTypeLength());
> [warn]                                                                      ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:224: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]             throw new IOException("Unsupported type: " + descriptor.getType());
> [warn]                                                                    ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:246: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]       descriptor.getType().toString(),
> [warn]                 ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:258: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]     switch (descriptor.getType()) {
> [warn]                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedColumnReader.java:384: warning: [deprecation] getType() in ColumnDescriptor has been deprecated
> [warn]         throw new UnsupportedOperationException("Unsupported type: " + descriptor.getType());
> [warn]                                                                                 ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/vectorized/ArrowColumnVector.java:458: warning: [static] static variable should be qualified by type name, BaseRepeatedValueVector, instead of by an expression
> [warn]       int index = rowId * accessor.OFFSET_WIDTH;
> [warn]                                   ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/main/java/org/apache/spark/sql/vectorized/ArrowColumnVector.java:460: warning: [static] static variable should be qualified by type name, BaseRepeatedValueVector, instead of by an expression
> [warn]       int end = offsets.getInt(index + accessor.OFFSET_WIDTH);
> [warn]                                                ^
> [warn] 27 warnings
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/BenchmarkQueryTest.scala:57: a pure expression does nothing in statement position; you may be omitting necessary parentheses
> [warn]       case s => s
> [warn]                 ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetInteroperabilitySuite.scala:182: inferred existential type org.apache.parquet.column.statistics.Statistics[?0]( forSome { type ?0 <: Comparable[?0] }), which cannot be expressed by wildcards, should be enabled
> [warn] by making the implicit value scala.language.existentials visible.
> [warn] This can be achieved by adding the import clause 'import scala.language.existentials'
> [warn] or by setting the compiler option -language:existentials.
> [warn] See the Scaladoc for value scala.language.existentials for a discussion
> [warn] why the feature should be explicitly enabled.
> [warn]                 val columnStats = oneBlockColumnMeta.getStatistics
> [warn]                                                      ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/sources/ForeachBatchSinkSuite.scala:146: implicit conversion method conv should be enabled
> [warn] by making the implicit value scala.language.implicitConversions visible.
> [warn] This can be achieved by adding the import clause 'import scala.language.implicitConversions'
> [warn] or by setting the compiler option -language:implicitConversions.
> [warn] See the Scaladoc for value scala.language.implicitConversions for a discussion
> [warn] why the feature should be explicitly enabled.
> [warn]     implicit def conv(x: (Int, Long)): KV = KV(x._1, x._2)
> [warn]                  ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/streaming/continuous/shuffle/ContinuousShuffleSuite.scala:48: implicit conversion method unsafeRow should be enabled
> [warn] by making the implicit value scala.language.implicitConversions visible.
> [warn]   private implicit def unsafeRow(value: Int) = {
> [warn]                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetInteroperabilitySuite.scala:176: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
> [warn]                   ParquetFileReader.readFooter(hadoopConf, part.getPath, NO_FILTER)
> [warn]                                     ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetInteroperabilitySuite.scala:178: method getType in class ColumnDescriptor is deprecated: see corresponding Javadoc for more information.
> [warn]                 assert(oneFooter.getFileMetaData.getSchema.getColumns.get(0).getType() ===
> [warn]                                                                              ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:154: method readAllFootersInParallel in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
> [warn]     ParquetFileReader.readAllFootersInParallel(configuration, fs.getFileStatus(path)).asScala.toSeq
> [warn]                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetTest.scala:158: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
> [warn]     ParquetFileReader.readFooter(
> [warn]                       ^
> [warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/mllib/src/test/scala/org/apache/spark/ml/recommendation/ALSSuite.scala:597: match may not be exhaustive.
> [warn] It would fail on the following inputs: None, Some((x: Tuple2[?, ?] forSome x not in (?, ?)))
> [warn]     val df = dfs.find {
> [warn]                       ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala:2117: method readFooter in object ParquetFileReader is deprecated: see corresponding Javadoc for more information.
> [warn]         val footer = ParquetFileReader.readFooter(
> [warn]                                        ^
> [warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7/sql/hive/src/test/java/org/apache/spark/sql/hive/test/Complex.java:679: warning: [cast] redundant cast to Complex
> [warn]     Complex typedOther = (Complex)other;
> [warn]                          ^
> {code}
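Most of the javac warnings above fall into a few recurring categories: [rawtypes] (using a generic class without type arguments) and [static] (calling a static member through an instance expression). A minimal sketch of the typical fixes, using hypothetical stand-in classes rather than the actual Spark sources:

```java
import java.util.ArrayList;
import java.util.List;

class Main {
    // Stand-in for a generic class like LevelDBIterator<T>.
    static class Iter<T> {
        private final List<T> items;
        Iter(List<T> items) { this.items = items; }
        int size() { return items.size(); }
    }

    // [rawtypes] warning comes from a raw signature such as:
    //     void closeIterator(Iter it) { ... }
    // Fix: keep the parameter generic; a wildcard suffices when T is unused.
    static int closeIterator(Iter<?> it) {
        return it.size();
    }

    // [static] warning comes from calling a static method through an
    // instance expression, e.g. hasher.hash(x). Fix: qualify by type name.
    static int hash(int x) { return x * 31; }

    public static void main(String[] args) {
        List<String> xs = new ArrayList<>();
        xs.add("a");
        System.out.println(closeIterator(new Iter<>(xs)));
        System.out.println(Main.hash(2));  // qualified by type name, not an instance
    }
}
```

The deprecation warnings are fixed analogously by switching to the replacement API the message names (e.g. `attemptNumber` instead of `attemptId` on `StageInfo`), which is mechanical but must be done per call site.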



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

