spark-issues mailing list archives

From "Hyukjin Kwon (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-17591) Fix/investigate the failure of tests in Scala On Windows
Date Mon, 21 Nov 2016 05:27:58 GMT

    [ https://issues.apache.org/jira/browse/SPARK-17591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15682578#comment-15682578 ]

Hyukjin Kwon commented on SPARK-17591:
--------------------------------------

I will close this once I am able to proceed further with the tests on Windows and see error logs other than the ones described in the description.

> Fix/investigate the failure of tests in Scala On Windows
> --------------------------------------------------------
>
>                 Key: SPARK-17591
>                 URL: https://issues.apache.org/jira/browse/SPARK-17591
>             Project: Spark
>          Issue Type: Test
>          Components: Build, DStreams, Spark Core, SQL
>            Reporter: Hyukjin Kwon
>
> {code}
> Tests run: 90, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 17.53 sec <<< FAILURE! - in org.apache.spark.JavaAPISuite
> wholeTextFiles(org.apache.spark.JavaAPISuite)  Time elapsed: 0.313 sec  <<< FAILURE!
> java.lang.AssertionError: 
> expected:<spark is easy to use.
> > but was:<null>
> 	at org.apache.spark.JavaAPISuite.wholeTextFiles(JavaAPISuite.java:1089)
> {code}
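> The {{wholeTextFiles}} failure above returns {{null}} where the file content is expected. Below is a minimal, hypothetical sketch (not the JavaAPISuite code; the object and file names are made up) of what the test exercises. One plausible suspect on Windows is that {{wholeTextFiles}} keys its results by {{file:/C:/...}}-style URIs, so a lookup built from a raw local path misses:
> {code}
> // Hypothetical sketch: write a file, read it back with wholeTextFiles, and print the
> // keys so a path-vs-URI mismatch becomes visible on Windows.
> import java.nio.charset.StandardCharsets
> import java.nio.file.{Files, Paths}
>
> import org.apache.spark.{SparkConf, SparkContext}
>
> object WholeTextFilesSketch {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("sketch"))
>     try {
>       val dir = Files.createTempDirectory("whole-text-files").toFile
>       Files.write(Paths.get(dir.getAbsolutePath, "part-00000"),
>         "spark is easy to use.\n".getBytes(StandardCharsets.UTF_8))
>
>       // On Windows the keys look like "file:/C:/Users/.../part-00000", not "C:\Users\...",
>       // so a map lookup keyed by the raw local path comes back empty/null.
>       val contents = sc.wholeTextFiles(dir.getAbsolutePath).collectAsMap()
>       contents.foreach { case (path, text) => println(s"$path -> ${text.trim}") }
>     } finally {
>       sc.stop()
>     }
>   }
> }
> {code}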
> {code}
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.062 sec <<< FAILURE! - in org.apache.spark.launcher.SparkLauncherSuite
> testChildProcLauncher(org.apache.spark.launcher.SparkLauncherSuite)  Time elapsed: 0.047 sec  <<< FAILURE!
> java.lang.AssertionError: expected:<0> but was:<1>
> 	at org.apache.spark.launcher.SparkLauncherSuite.testChildProcLauncher(SparkLauncherSuite.java:177)
> {code}
> {code}
> Tests run: 53, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 22.325 sec <<< FAILURE! - in org.apache.spark.streaming.JavaAPISuite
> testCheckpointMasterRecovery(org.apache.spark.streaming.JavaAPISuite)  Time elapsed: 3.418 sec  <<< ERROR!
> java.io.IOException: Failed to delete: C:\projects\spark\streaming\target\tmp\1474255953021-0
> 	at org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery(JavaAPISuite.java:1808)
> Running org.apache.spark.streaming.JavaDurationSuite
> {code}
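> The {{Failed to delete}} error looks like Windows file semantics rather than test logic: Windows refuses to delete a file while any handle to it is still open, unlike POSIX. A standalone sketch (hypothetical names, not Spark's cleanup code) showing the difference:
> {code}
> // Hypothetical sketch: deleting a file whose handle is still open fails on Windows
> // but succeeds on Linux/macOS, which is why temp-dir cleanup can throw IOException there.
> import java.io.{File, FileInputStream}
> import java.nio.charset.StandardCharsets
> import java.nio.file.Files
>
> object WindowsDeleteSketch {
>   def main(args: Array[String]): Unit = {
>     val dir = Files.createTempDirectory("delete-sketch").toFile
>     val file = new File(dir, "checkpoint")
>     Files.write(file.toPath, "data".getBytes(StandardCharsets.UTF_8))
>
>     val in = new FileInputStream(file)    // a handle left open, like a stream closed too late
>     val deletedWhileOpen = file.delete()  // false on Windows, true on Linux/macOS
>     in.close()
>     val deletedAfterClose = file.delete()
>
>     println(s"delete with handle open: $deletedWhileOpen, after close: $deletedAfterClose")
>     dir.delete()
>   }
> }
> {code}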
> {code}
> Results :
> Tests in error: 
>   JavaAPISuite.testCheckpointMasterRecovery:1808 » IO Failed to delete: C:\proje...
> Tests run: 74, Failures: 0, Errors: 1, Skipped: 0
> {code}
> The tests were aborted for an unknown reason during the SQL tests, with {{BroadcastJoinSuite}} emitting the exceptions below continuously:
> {code}
> 20:48:09.876 ERROR org.apache.spark.deploy.worker.ExecutorRunner: Error running executor
> java.io.IOException: Cannot run program "C:\Progra~1\Java\jdk1.8.0\bin\java" (in directory "C:\projects\spark\work\app-20160918204809-0000\0"): CreateProcess error=206, The filename or extension is too long
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
> 	at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:167)
> 	at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:73)
> Caused by: java.io.IOException: CreateProcess error=206, The filename or extension is too long
> 	at java.lang.ProcessImpl.create(Native Method)
> 	at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
> 	at java.lang.ProcessImpl.start(ProcessImpl.java:137)
> 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
> 	... 2 more
> {code}
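> The {{error=206}} failure is presumably the well-known CreateProcess limit: Windows caps a command line at roughly 32K characters, and the executor launch command embeds the full classpath. A hypothetical standalone sketch (not the {{ExecutorRunner}} code; names made up) that reproduces the error shape:
> {code}
> // Hypothetical sketch: launching a child JVM with an oversized classpath argument
> // fails on Windows with "CreateProcess error=206, The filename or extension is too long".
> import scala.collection.JavaConverters._
>
> object LongCommandLineSketch {
>   def main(args: Array[String]): Unit = {
>     // Far beyond the ~32,767-character limit CreateProcess accepts for a command line.
>     val hugeClasspath =
>       Seq.fill(2000)("C:\\projects\\spark\\jars\\some-long-dependency-name.jar").mkString(";")
>
>     try {
>       new ProcessBuilder(Seq("java", "-cp", hugeClasspath, "-version").asJava)
>         .inheritIO()
>         .start()
>         .waitFor()
>     } catch {
>       // Surfaces only on Windows; Linux/macOS tolerate much longer argument lists.
>       case e: java.io.IOException => println(s"launch failed: ${e.getMessage}")
>     }
>   }
> }
> {code}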
> Here is the full log for the test - https://ci.appveyor.com/project/spark-test/spark/build/15-scala-tests
> We may have to create sub-tasks if these are actual issues on Windows rather than just mistakes in the tests.
> I am willing to test this again after fixing some of the issues here, in particular the last one.
> I triggered the build with the commands below:
> {code}
> mvn -DskipTests -Phadoop-2.6 -Phive -Phive-thriftserver package
> mvn -Phadoop-2.6 -Phive -Phive-thriftserver --fail-never test
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

