spark-issues mailing list archives

From "Hyukjin Kwon (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-20840) Misleading spurious errors when there are Javadoc (Unidoc) breaks
Date Mon, 22 May 2017 14:41:04 GMT

     [ https://issues.apache.org/jira/browse/SPARK-20840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-20840:
---------------------------------
    Description: 
Currently, when there are Javadoc breaks, the build seems to print warnings as errors.

For example, the actual errors were as below in https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77070/consoleFull:

{code}
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/HighlyCompressedMapStatus.java:4: error: reference not found
[error]  * than both {@link config.SHUFFLE_ACCURATE_BLOCK_THRESHOLD} and
[error]                     ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/HighlyCompressedMapStatus.java:5: error: reference not found
[error]  * {@link config.SHUFFLE_ACCURATE_BLOCK_THRESHOLD_BY_TIMES_AVERAGE} * averageSize. It stores the
[error]           ^
{code}

but it also prints many errors from the generated Java code, as below:

{code}
[info] Constructing Javadoc information...
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[error]   public   BlacklistTracker (org.apache.spark.scheduler.LiveListenerBus listenerBus, org.apache.spark.SparkConf conf, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient, org.apache.spark.util.Clock clock)  { throw new RuntimeException(); }
[error]                                                                                                                                                    ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:118: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[error]   public   BlacklistTracker (org.apache.spark.SparkContext sc, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient)  { throw new RuntimeException(); }
[error]                                                                                             ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:133: error: ConfigReader is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error]   private  org.apache.spark.internal.config.ConfigReader reader ()  { throw new RuntimeException(); }
[error]                                            ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:138: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error]    <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.ConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[error]                                                                                                 ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:139: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error]    <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[error]                                                                                                 ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:187: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error]    <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.ConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[error]                                                                                                          ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:188: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error]    <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[error]                                                                                                          ^
[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/core/target/java/org/apache/spark/SparkConf.java:208: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[error]     org.apache.spark.SparkConf remove (org.apache.spark.internal.config.ConfigEntry<?> entry)  { throw new RuntimeException(); }
[error]
...
{code}

These errors are actually warnings in a successful build without Javadoc breaks, as below:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7/2908/consoleFull

{code}
[info] Constructing Javadoc information...
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[warn]   public   BlacklistTracker (org.apache.spark.scheduler.LiveListenerBus listenerBus, org.apache.spark.SparkConf conf, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient, org.apache.spark.util.Clock clock)  { throw new RuntimeException(); }
[warn]                                                                                                                                                    ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/scheduler/BlacklistTracker.java:118: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package
[warn]   public   BlacklistTracker (org.apache.spark.SparkContext sc, scala.Option<org.apache.spark.ExecutorAllocationClient> allocationClient)  { throw new RuntimeException(); }
[warn]                                                                                             ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:133: error: ConfigReader is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn]   private  org.apache.spark.internal.config.ConfigReader reader ()  { throw new RuntimeException(); }
[warn]                                            ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:138: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn]    <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.ConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[warn]                                                                                                 ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:139: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn]    <T extends java.lang.Object> org.apache.spark.SparkConf set (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[warn]                                                                                                 ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:187: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn]    <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.ConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[warn]                                                                                                          ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:188: error: OptionalConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn]    <T extends java.lang.Object> org.apache.spark.SparkConf setIfMissing (org.apache.spark.internal.config.OptionalConfigEntry<T> entry, T value)  { throw new RuntimeException(); }
[warn]                                                                                                          ^
[warn] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/core/target/java/org/apache/spark/SparkConf.java:208: error: ConfigEntry is not public in org.apache.spark.internal.config; cannot be accessed from outside package
[warn]     org.apache.spark.SparkConf remove (org.apache.spark.internal.config.ConfigEntry<?> entry)  { throw new RuntimeException(); }
[warn]
...
{code}

These are warnings, not errors, in {{javadoc}} itself; however, once we introduce a Javadoc break, sbt seems to report the other warnings as errors while generating the Javadoc.

For example, with the Java file {{A.java}} below:

{code}
/**
 * Hi
 */
public class A extends B {
}
{code}

if we run {{javadoc}}:

{code}
javadoc A.java
{code}

it produces a warning because it cannot find the symbol {{B}}, yet it still seems to generate the documentation fine:

{code}
Loading source file A.java...
Constructing Javadoc information...
A.java:4: error: cannot find symbol
public class A extends B {
                       ^
  symbol: class B
Standard Doclet version 1.8.0_45
Building tree for all the packages and classes...
Generating ./A.html...
Generating ./package-frame.html...
Generating ./package-summary.html...
Generating ./package-tree.html...
Generating ./constant-values.html...
Building index for all the packages and classes...
Generating ./overview-tree.html...
Generating ./index-all.html...
Generating ./deprecated-list.html...
Building index for all classes...
Generating ./allclasses-frame.html...
Generating ./allclasses-noframe.html...
Generating ./index.html...
Generating ./help-doc.html...
1 warning
{code}

However, if we have a Javadoc break in the comment, as below:

{code}
/**
 * Hi
 * @see B
 */
public class A extends B {
}
{code}

this produces one error and one warning:

{code}
Loading source file A.java...
Constructing Javadoc information...
A.java:5: error: cannot find symbol
public class A extends B {
                       ^
  symbol: class B
Standard Doclet version 1.8.0_45
Building tree for all the packages and classes...
Generating ./A.html...
A.java:3: error: reference not found
* @see B
       ^
Generating ./package-frame.html...
Generating ./package-summary.html...
Generating ./package-tree.html...
Generating ./constant-values.html...
Building index for all the packages and classes...
Generating ./overview-tree.html...
Generating ./index-all.html...
Generating ./deprecated-list.html...
Building index for all classes...
Generating ./allclasses-frame.html...
Generating ./allclasses-noframe.html...
Generating ./index.html...
Generating ./help-doc.html...
1 error
1 warning
{code}

It seems {{sbt unidoc}} reports both errors and warnings as {{\[error\]}} when there are breaks (related context appears to be described in https://github.com/sbt/sbt/issues/875#issuecomment-24315400), as sketched below.
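
The following is a minimal, hypothetical sketch in plain Java of that suspected behaviour (the class and method names are mine; this is not sbt's actual implementation). It only illustrates the pattern observed above: the same {{javadoc}} diagnostics come out as {{\[warn\]}} when the run succeeds and as {{\[error\]}} when a break makes the run fail.

{code}
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch, not sbt's real code: it mimics the observed behaviour,
// where the exit status of the javadoc process seems to decide how every
// diagnostic line is tagged, regardless of whether javadoc itself counted
// the line as a warning or an error.
public class UnidocLoggingSketch {
    static void relabel(List<String> javadocDiagnostics, int javadocExitCode) {
        String tag = javadocExitCode == 0 ? "[warn] " : "[error] ";
        for (String line : javadocDiagnostics) {
            System.out.println(tag + line);
        }
    }

    public static void main(String[] args) {
        List<String> diagnostics = Arrays.asList(
            "BlacklistTracker.java:117: error: ExecutorAllocationClient is not public in org.apache.spark; cannot be accessed from outside package");
        relabel(diagnostics, 0); // successful run: reported as [warn]
        relabel(diagnostics, 1); // run with a Javadoc break: reported as [error]
    }
}
{code}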

Given my observations so far, it is generally enough to fix only the {{\[info\] # errors}} printed at the bottom, which are usually produced during the HTML-generating {{Building tree for all the packages and classes...}} phase; a filtering sketch follows below.
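
As a hypothetical illustration of that workflow (the helper below is my own, not part of the Spark build), one could scan a saved console log and keep only the diagnostics emitted after the doclet's {{Building tree for all the packages and classes...}} phase begins, which is where the genuine Javadoc-break errors such as {{error: reference not found}} show up:

{code}
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Hypothetical helper, not part of the Spark build: prints only the
// "reference not found" diagnostics that appear once the doclet's
// "Building tree ..." phase has started in a saved console log, skipping
// the spurious errors printed from the generated Java stubs before it.
public class GenuineBreakFilter {
    public static void main(String[] args) throws IOException {
        boolean inDocletPhase = false;
        for (String line : Files.readAllLines(Paths.get(args[0]))) {
            if (line.contains("Building tree for all the packages and classes")) {
                inDocletPhase = true;
            }
            if (inDocletPhase && line.contains("error: reference not found")) {
                System.out.println(line);
            }
        }
    }
}
{code}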

Essentially, this looks like a bug in GenJavaDoc, which generates invalid Java code, and a bug in sbt, which fails to distinguish warnings from errors in this case; a small reproduction of the generated-code problem is sketched below.
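
For reference, the {{... is not public in ...; cannot be accessed from outside package}} family of diagnostics above can be reproduced with a tiny, hand-written sketch (hypothetical file and class names) that mirrors what the generated stubs do: a public signature leaking a package-private type, analogous to the package-private {{org.apache.spark.internal.config.ConfigEntry}} above.

{code}
// File p/Hidden.java: a package-private class.
package p;
class Hidden {
}
{code}

{code}
// File q/Stub.java: a public stub whose signature leaks the package-private
// type, as the generated Java stubs above do. Running "javadoc q/Stub.java"
// reports a diagnostic like:
//   error: Hidden is not public in p; cannot be accessed from outside package
package q;
public class Stub {
    public p.Hidden get() { throw new RuntimeException(); }
}
{code}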

As a result, the message shown via Jenkins is quite confusing.



> Misleading spurious errors when there are Javadoc (Unidoc) breaks
> -----------------------------------------------------------------
>
>                 Key: SPARK-20840
>                 URL: https://issues.apache.org/jira/browse/SPARK-20840
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Project Infra
>    Affects Versions: 2.2.0
>            Reporter: Hyukjin Kwon
>



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
