From: "Mohsen B.Sarmadi" <mohsen.bsarmadi@gmail.com>
Date: Fri, 26 Apr 2013 19:23:08 +0100
To: user@hadoop.apache.org

Hi,

I am a newbie in Hadoop. I am running Hadoop on Mac OS X 10 and I can't load any files into HDFS.

First of all, I am getting this error:

localhost: 2013-04-26 19:08:31.330 java[14436:1b03] Unable to load realm info from SCDynamicStore

From some posts I understand that I should add the following line to hadoop-env.sh, but it didn't fix it:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

Second, I can't load any files into HDFS. I am trying to run Hadoop in pseudo-distributed mode, so I used the configuration from here for it (sketched below). I am sure Hadoop is loading my configuration, because I have successfully added JAVA_HOME to hadoop-env.sh.
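For completeness, here is roughly what my setup looks like. I am reproducing the stock pseudo-distributed template from the single-node setup guide rather than pasting my exact files, so treat the values below (ports, the JAVA_HOME line) as approximate:

# conf/hadoop-env.sh additions (the JAVA_HOME line is the usual OS X way of
# locating Java, not necessarily the exact value I used):
export JAVA_HOME=$(/usr/libexec/java_home)
export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

# Stock pseudo-distributed settings from the Hadoop 1.x single-node guide
# (the port numbers are the documented defaults):
cat > conf/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > conf/hdfs-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

cat > conf/mapred-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF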
This is the error I get:

m0h3n:hadoop-1.0.4 mohsen$ ./bin/hadoop dfs -put conf input
2013-04-26 19:18:04.185 java[14559:1703] Unable to load realm info from SCDynamicStore
13/04/26 19:18:04 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
        at org.apache.hadoop.ipc.Client.call(Client.java:1070)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3510)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3373)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2589)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2829)
13/04/26 19:18:04 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/04/26 19:18:04 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/mohsen/input/capacity-scheduler.xml" - Aborting...
put: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
13/04/26 19:18:04 ERROR hdfs.DFSClient: Exception closing file /user/mohsen/input/capacity-scheduler.xml : org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/mohsen/input/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
        at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
        at org.apache.hadoop.ipc.Client.call(Client.java:1070)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3510)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3373)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2589)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2829)

--
Mohsen B.S

This is the result of jps:
m0h3n:hadoop-1.0.4 mohsen$ jps
357
14588 Jps
14436 TaskTracker
14261 SecondaryNameNode
14059 NameNode
14335 JobTracker

Please help me to overcome this problem.

Regards,
Mohsen
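P.S. If it helps with diagnosis, I can also collect the output of the commands below. This is only a sketch based on the default Hadoop 1.0.4 layout; the exact log file name on my machine may differ:

# Is a DataNode process running at all, and does HDFS report any live nodes?
jps | grep DataNode
./bin/hadoop dfsadmin -report

# Look for DataNode startup errors (e.g. a namespaceID mismatch) in its log:
tail -n 50 logs/hadoop-*-datanode-*.log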