hadoop-hdfs-issues mailing list archives

From "Haridas Kandath (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (HDDS-320) Failed to start container with apache/hadoop-runner image.
Date Fri, 19 Oct 2018 13:54:00 GMT

[ https://issues.apache.org/jira/browse/HDDS-320?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16656832#comment-16656832 ]

Haridas Kandath edited comment on HDDS-320 at 10/19/18 1:53 PM:
----------------------------------------------------------------

Tried the above steps. Running the smoke tests gives the following:

-------------------------------------------------
Executing test(s): [basic]

  Cluster type:      ozone
  Compose file:      /home/haridas/projects/additional/hadoop/hadoop-ozone/dist/target/ozone-0.4.0-SNAPSHOT/smoketest/../compose/ozone/docker-compose.yaml
  Output dir:        /home/haridas/projects/additional/hadoop/hadoop-ozone/dist/target/ozone-0.4.0-SNAPSHOT/smoketest/result
  Command to rerun:  ./test.sh --keep --env ozone basic
-------------------------------------------------
Removing ozone_ozoneManager_1 ... done
Removing ozone_scm_1          ... done
Removing ozone_datanode_1     ... done
Removing network ozone_default
Creating network "ozone_default" with the default driver
Creating ozone_ozoneManager_1 ... done
Creating ozone_datanode_1     ... done
Creating ozone_scm_1          ... done
Waiting 30s for cluster start up...
ERROR: No container found for datanode_1
Removing ozone_scm_1          ... done
Removing ozone_datanode_1     ... done
Removing ozone_ozoneManager_1 ... done
Removing network ozone_default
-------------------------------------------------
Executing test(s): [ozonefs]

  Cluster type:      ozonefs
  Compose file:      /home/haridas/projects/additional/hadoop/hadoop-ozone/dist/target/ozone-0.4.0-SNAPSHOT/smoketest/../compose/ozonefs/docker-compose.yaml
  Output dir:        /home/haridas/projects/additional/hadoop/hadoop-ozone/dist/target/ozone-0.4.0-SNAPSHOT/smoketest/result
  Command to rerun:  ./test.sh --keep --env ozonefs ozonefs
-------------------------------------------------
Stopping ozonefs_hadooplast_1 ... done
Removing ozonefs_ozoneManager_1 ... done
Removing ozonefs_scm_1          ... done
Removing ozonefs_datanode_1     ... done
Removing ozonefs_hadooplast_1   ... done
Removing network ozonefs_default
Creating network "ozonefs_default" with the default driver
Creating ozonefs_ozoneManager_1 ... done
Creating ozonefs_hadooplast_1   ... done
Creating ozonefs_datanode_1     ... done
Creating ozonefs_scm_1          ... done
Waiting 30s for cluster start up...
ERROR: No container found for datanode_1
Stopping ozonefs_hadooplast_1 ... done
Removing ozonefs_datanode_1     ... done
Removing ozonefs_scm_1          ... done
Removing ozonefs_hadooplast_1   ... done
Removing ozonefs_ozoneManager_1 ... done
Removing network ozonefs_default
-------------------------------------------------
Executing test(s): [s3]

  Cluster type:      ozones3
  Compose file:      /home/haridas/projects/additional/hadoop/hadoop-ozone/dist/target/ozone-0.4.0-SNAPSHOT/smoketest/../compose/ozones3/docker-compose.yaml
  Output dir:        /home/haridas/projects/additional/hadoop/hadoop-ozone/dist/target/ozone-0.4.0-SNAPSHOT/smoketest/result
  Command to rerun:  ./test.sh --keep --env ozones3 s3
-------------------------------------------------
Removing ozones3_scm_1          ... done
Removing ozones3_s3g_1          ... done
Removing ozones3_datanode_1     ... done
Removing ozones3_ozoneManager_1 ... done
Removing network ozones3_default
Creating network "ozones3_default" with the default driver
Creating ozones3_ozoneManager_1 ... done
Creating ozones3_s3g_1          ... done
Creating ozones3_datanode_1     ... done
Creating ozones3_scm_1          ... done
Waiting 30s for cluster start up...
ERROR: No container found for datanode_1
Removing ozones3_datanode_1     ... done
Removing ozones3_scm_1          ... done
Removing ozones3_s3g_1          ... done
Removing ozones3_ozoneManager_1 ... done
Removing network ozones3_default
Setting up environment!
[ ERROR ] Reading XML source 'smoketest/result/robot-*.xml' failed: No such file or directory


*Logs in result/docker-ozone-basic.log*
Attaching to ozone_scm_1, ozone_datanode_1, ozone_ozoneManager_1
scm_1           | Traceback (most recent call last):
scm_1           |   File "/opt/envtoconf.py", line 104, in <module>
scm_1           |     Simple(sys.argv[1:]).main()
scm_1           |   File "/opt/envtoconf.py", line 93, in main
scm_1           |     self.process_envs()
scm_1           |   File "/opt/envtoconf.py", line 67, in process_envs
scm_1           |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
scm_1           | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
ozoneManager_1  | Traceback (most recent call last):
ozoneManager_1  |   File "/opt/envtoconf.py", line 104, in <module>
ozoneManager_1  |     Simple(sys.argv[1:]).main()
ozoneManager_1  |   File "/opt/envtoconf.py", line 93, in main
ozoneManager_1  |     self.process_envs()
ozoneManager_1  |   File "/opt/envtoconf.py", line 67, in process_envs
ozoneManager_1  |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
ozoneManager_1  | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
datanode_1      | Traceback (most recent call last):
datanode_1      |   File "/opt/envtoconf.py", line 104, in <module>
datanode_1      |     Simple(sys.argv[1:]).main()
datanode_1      |   File "/opt/envtoconf.py", line 93, in main
datanode_1      |     self.process_envs()
datanode_1      |   File "/opt/envtoconf.py", line 67, in process_envs
datanode_1      |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
datanode_1      | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
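The tracebacks above all fail at the same step in /opt/envtoconf.py: writing a `.raw` copy of a generated config file under /opt/hadoop/etc/hadoop. A simplified sketch of that step (the function and parameter names below are assumptions inferred from the traceback, not the actual script):

```python
import os

def write_raw(dest_dir, name, extension, content):
    # Roughly what envtoconf.py line 67 appears to do: write the raw value to
    # <dest_dir>/<name>.<extension>.raw before transforming it into a config
    # file. If dest_dir (here /opt/hadoop/etc/hadoop) is not writable by the
    # container user, open(..., "w") raises IOError with errno 13
    # (Permission denied), matching the logs above.
    path = os.path.join(dest_dir, "{0}.{1}.raw".format(name, extension))
    with open(path, "w") as myfile:
        myfile.write(content)
    return path
```

This is why the containers exit immediately and the smoke tests then report "No container found for datanode_1": the entrypoint dies before any daemon starts.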



> Failed to start container with apache/hadoop-runner image.
> ----------------------------------------------------------
>
>                 Key: HDDS-320
>                 URL: https://issues.apache.org/jira/browse/HDDS-320
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>          Components: document
>         Environment: centos 7.4
>            Reporter: Junjie Chen
>            Priority: Minor
>
> Following the doc in hadoop-ozone/doc/content/GettingStarted.md, the docker-compose up -d step failed; the errors are listed below:
> [root@VM_16_5_centos ozone]# docker-compose logs
> Attaching to ozone_scm_1, ozone_datanode_1, ozone_ozoneManager_1
> datanode_1      | Traceback (most recent call last):
> datanode_1      |   File "/opt/envtoconf.py", line 104, in <module>
> datanode_1      |     Simple(sys.argv[1:]).main()
> datanode_1      |   File "/opt/envtoconf.py", line 93, in main
> datanode_1      |     self.process_envs()
> datanode_1      |   File "/opt/envtoconf.py", line 67, in process_envs
> datanode_1      |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> datanode_1      | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
> datanode_1      | Traceback (most recent call last):
> datanode_1      |   File "/opt/envtoconf.py", line 104, in <module>
> datanode_1      |     Simple(sys.argv[1:]).main()
> datanode_1      |   File "/opt/envtoconf.py", line 93, in main
> datanode_1      |     self.process_envs()
> datanode_1      |   File "/opt/envtoconf.py", line 67, in process_envs
> datanode_1      |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> ozoneManager_1  |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> ozoneManager_1  | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
> ozoneManager_1  | Traceback (most recent call last):
> ozoneManager_1  |   File "/opt/envtoconf.py", line 104, in <module>
> ozoneManager_1  |     Simple(sys.argv[1:]).main()
> ozoneManager_1  |   File "/opt/envtoconf.py", line 93, in main
> ozoneManager_1  |     self.process_envs()
> ozoneManager_1  |   File "/opt/envtoconf.py", line 67, in process_envs
> ozoneManager_1  |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> ozoneManager_1  | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
> scm_1           | Traceback (most recent call last):
> scm_1           |   File "/opt/envtoconf.py", line 104, in <module>
> scm_1           |     Simple(sys.argv[1:]).main()
> scm_1           |   File "/opt/envtoconf.py", line 93, in main
> scm_1           |     self.process_envs()
> scm_1           |   File "/opt/envtoconf.py", line 67, in process_envs
> scm_1           |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> scm_1           | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
> scm_1           | Traceback (most recent call last):
> scm_1           |   File "/opt/envtoconf.py", line 104, in <module>
> scm_1           |     Simple(sys.argv[1:]).main()
> scm_1           |   File "/opt/envtoconf.py", line 93, in main
> scm_1           |     self.process_envs()
> scm_1           |   File "/opt/envtoconf.py", line 67, in process_envs
> scm_1           |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> scm_1           | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
> scm_1           | Traceback (most recent call last):
> scm_1           |   File "/opt/envtoconf.py", line 104, in <module>
> scm_1           |     Simple(sys.argv[1:]).main()
> scm_1           |   File "/opt/envtoconf.py", line 93, in main
> scm_1           |     self.process_envs()
> scm_1           |   File "/opt/envtoconf.py", line 67, in process_envs
> scm_1           |     with open(self.destination_file_path(name, extension) + ".raw", "w") as myfile:
> scm_1           | IOError: [Errno 13] Permission denied: '/opt/hadoop/etc/hadoop/log4j.properties.raw'
> my docker-compose version is:
> docker-compose version 1.22.0, build f46880fe
> docker images:
> apache/hadoop-runner   latest              569314fd9a73        5 weeks ago         646MB
> From the Dockerfile, we can see the "chown hadoop /opt" command. It looks like we need a "-R" here?
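If the non-recursive chown is indeed the cause, the change suggested above would look roughly like this in the hadoop-runner Dockerfile (a sketch; the surrounding Dockerfile context is an assumption, only the added -R flag is the proposed fix):

```shell
# Make the hadoop user own /opt recursively, so that files such as
# /opt/hadoop/etc/hadoop/log4j.properties are writable by envtoconf.py.
# Without -R, only /opt itself changes owner and the nested config files
# stay unwritable, producing the IOError (errno 13) seen in the logs.
chown -R hadoop /opt
```

In the Dockerfile this would be the command in the existing RUN instruction, with `-R` added.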



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
