hadoop-hdfs-user mailing list archives

From sudhakara st <sudhakara...@gmail.com>
Subject Re:
Date Sat, 27 Apr 2013 19:23:44 GMT
Hello,

Your DataNode daemon is not running; please check the DataNode logs.
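
A minimal sketch of what to check, assuming the default Hadoop 1.0.4 layout
(the <user> and <hostname> parts below are placeholders for your own values):

  jps
      # a healthy pseudo-distributed setup lists DataNode here; it is missing
      # from the jps output you posted
  tail -100 logs/hadoop-<user>-datanode-<hostname>.log
      # look for the exception that made the DataNode exit
  bin/hadoop-daemon.sh start datanode
      # try starting it again, then re-check jps and the log

A common cause after re-formatting the namenode is an "Incompatible namespaceIDs"
error in that log; if you see it, wipe the directory configured as dfs.data.dir
(${hadoop.tmp.dir}/dfs/data by default) and restart HDFS with bin/stop-all.sh
and bin/start-all.sh.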


On Fri, Apr 26, 2013 at 11:53 PM, Mohsen B.Sarmadi <mohsen.bsarmadi@gmail.com> wrote:

> Hi,
> I am a newbie in Hadoop.
> I am running Hadoop on Mac OS X 10 and I can't load any files into HDFS.
>
> First of all, I am getting this error:
>
> localhost: 2013-04-26 19:08:31.330 java[14436:1b03] Unable to load realm info from SCDynamicStore
>
> which, from some posts, I understand means I should add this line to
> hadoop-env.sh, but it didn't fix it:
>
> export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
> Second, I can't load any files into HDFS. I am trying to run Hadoop in
> pseudo-distributed mode, so I used the configuration from here
> <http://hadoop.apache.org/docs/r1.0.4/single_node_setup.html> for it.
> I am sure Hadoop is loading my configuration, because I successfully added
> JAVA_HOME to hadoop-env.sh.
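>
> For reference, the pseudo-distributed settings from that page are roughly the
> following (each <property> goes inside the <configuration> element of the
> named file; localhost with ports 9000/9001 are the values the guide uses):
>
> conf/core-site.xml:
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://localhost:9000</value>
>   </property>
>
> conf/hdfs-site.xml:
>   <property>
>     <name>dfs.replication</name>
>     <value>1</value>
>   </property>
>
> conf/mapred-site.xml:
>   <property>
>     <name>mapred.job.tracker</name>
>     <value>localhost:9001</value>
>   </property>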
> This is the error I get:
>
> m0h3n:hadoop-1.0.4 mohsen$ ./bin/hadoop dfs -put conf input
> 2013-04-26 19:18:04.185 java[14559:1703] Unable to load realm info from SCDynamicStore
> 13/04/26 19:18:04 WARN hdfs.DFSClient: DataStreamer Exception:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>   at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
>   at org.apache.hadoop.ipc.Client.call(Client.java:1070)
>   at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>   at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>   at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>   at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3510)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3373)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2589)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2829)
>
> 13/04/26 19:18:04 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
> 13/04/26 19:18:04 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/mohsen/input/capacity-scheduler.xml" - Aborting...
> put: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
> 13/04/26 19:18:04 ERROR hdfs.DFSClient: Exception closing file /user/mohsen/input/capacity-scheduler.xml :
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>   at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>   at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
>   at org.apache.hadoop.ipc.Client.call(Client.java:1070)
>   at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>   at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:601)
>   at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>   at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>   at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3510)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3373)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2589)
>   at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2829)
>
>
> This is the result of jps:
> m0h3n:hadoop-1.0.4 mohsen$ jps
> 357
> 14588 Jps
> 14436 TaskTracker
> 14261 SecondaryNameNode
> 14059 NameNode
> 14335 JobTracker
>
>
> Please help me to overcome this problem.
>
> Regards,
> Mohsen
>



-- 

Regards,
.....  Sudhakara.st
