phoenix-dev mailing list archives

From "Josh Mahonin (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (PHOENIX-4159) phoenix-spark tests are failing
Date Tue, 05 Sep 2017 20:24:00 GMT

    [ https://issues.apache.org/jira/browse/PHOENIX-4159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16154235#comment-16154235 ]

Josh Mahonin commented on PHOENIX-4159:
---------------------------------------

Quick update: I believe it has to do with improper handling of the underlying 'tmpFolder' ClassRule
across the two phoenix-spark integration tests. I suspect this came about when PHOENIX-3532
went in, though there were other ongoing build issues at the time, so it went unnoticed. It doesn't
affect the integration tests when they are run individually and directly from IntelliJ, but it does
come up when they are run in parallel through Maven. Will try to tackle this ASAP.
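
For illustration only (the actual cause is still under investigation): the sketch below mimics how a single temp folder cached across two test suites, the way a shared 'tmpFolder' ClassRule would be, can break the second suite once the first suite's teardown deletes the folder. All names here (SharedTmpFolderSketch, suiteSetup, startMiniCluster) are hypothetical stand-ins, not Phoenix or Hadoop code.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

// Hypothetical sketch -- not Phoenix code. Two "suites" share one cached
// temp folder; the first suite's teardown wipes it, leaving the second
// suite holding a stale path.
public class SharedTmpFolderSketch {

    // Cached across suites -- the suspected anti-pattern.
    static Path sharedTmp;

    static void suiteSetup() throws IOException {
        if (sharedTmp == null) {  // the second suite reuses the stale path
            sharedTmp = Files.createTempDirectory("dfscluster_");
        }
    }

    static void suiteTeardown() throws IOException {
        // ClassRule-style cleanup: the whole folder is removed when the suite ends.
        try (Stream<Path> paths = Files.walk(sharedTmp)) {
            paths.sorted(Comparator.reverseOrder()).forEach(p -> p.toFile().delete());
        }
    }

    static void startMiniCluster() throws IOException {
        // Stand-in for the mini-cluster laying out dfs/ under the rule's folder.
        // createDirectory (unlike createDirectories) fails when the parent is
        // gone, roughly mirroring the "Cannot create directory" IOException
        // in the stack trace below.
        Files.createDirectory(sharedTmp.resolve("dfs"));
    }

    static String demo() throws IOException {
        suiteSetup();
        startMiniCluster();  // suite 1: works
        suiteTeardown();     // suite 1 finishes and wipes the shared folder
        suiteSetup();        // suite 2: keeps the now-deleted cached path
        try {
            startMiniCluster();
            return "suite 2 started";
        } catch (IOException e) {
            return "suite 2 failed: " + e;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(demo());
    }
}
```

Run sequentially the failure is deterministic; run in parallel (as Maven does here) the same sharing turns into a race between one suite's cleanup and the other's cluster formatting.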

> phoenix-spark tests are failing
> -------------------------------
>
>                 Key: PHOENIX-4159
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-4159
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Samarth Jain
>
> In a few of the runs where we were able to get successful test runs for phoenix-core, we ran into failures for the phoenix-spark module.
> Sample run - https://builds.apache.org/job/Phoenix-master/1762/console
> [~jmahonin] - would you mind taking a look? Copy-pasting a possibly relevant stacktrace here in case the link is no longer working:
> {code}
> Formatting using clusterid: testClusterID
> 1    [ScalaTest-4] ERROR org.apache.hadoop.hdfs.MiniDFSCluster  - IOE creating namenodes. Permissions dump:
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/data':
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/data
> 	permissions: ----
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs':
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386':
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b':
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data':
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace/Phoenix-master': 
> 	absolute:/home/jenkins/jenkins-slave/workspace/Phoenix-master
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave/workspace': 
> 	absolute:/home/jenkins/jenkins-slave/workspace
> 	permissions: drwx
> path '/home/jenkins/jenkins-slave': 
> 	absolute:/home/jenkins/jenkins-slave
> 	permissions: drwx
> path '/home/jenkins': 
> 	absolute:/home/jenkins
> 	permissions: drwx
> path '/home': 
> 	absolute:/home
> 	permissions: dr-x
> path '/': 
> 	absolute:/
> 	permissions: dr-x
> java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
> 	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
> 	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
> 	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
> 	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
> 	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
> 	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:625)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1022)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:903)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:885)
> 	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:872)
> 	at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:522)
> 	at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:442)
> 	at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:424)
> 	at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:495)
> 	at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:490)
> 	at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
> 	at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
> 	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
> 	at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
> 	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
> 	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
> 	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.run(AbstractPhoenixSparkIT.scala:44)
> 	at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> Exception encountered when invoking run on a nested suite - java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
> *** ABORTED ***
>   java.lang.RuntimeException: java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
>   at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:525)
>   at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:442)
>   at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:424)
>   at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:495)
>   at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:490)
>   at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
>   at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
>   at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
>   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
>   at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
>   ...
>   Cause: java.io.IOException: Cannot create directory /home/jenkins/jenkins-slave/workspace/Phoenix-master/phoenix-spark/target/test-data/fa615cb3-a0d9-4c9e-90eb-acd0c7d46d9b/dfscluster_1ce5f5c4-f355-4111-a763-4ab777941386/dfs/name1/current
>   at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
>   at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
>   at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
>   at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
>   at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
>   ...
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
