From: Vinayakumar B <vinayakumar.b@huawei.com>
To: user@hadoop.apache.org
Date: Thu, 13 Mar 2014 08:12:08 +0000
Subject: RE: Reading files from hdfs directory

Hi Satyam,

Check whether your Camel client-side configuration points to the correct NameNode(s).

What is the deployment: HA or non-HA?

Also check whether the same exception appears in the (Active) NameNode logs. If it does not, the request is going to some other NameNode.

Regards,
Vinayakumar B.

From: Satyam Singh [mailto:satyam.singh@ericsson.com]
Sent: 13 March 2014 13:29
To: user@hadoop.apache.org
Subject: Reading files from hdfs directory

Hello,

I want to read files from HDFS remotely through a camel-hdfs client. I have modified the camel-hdfs component to support Hadoop 2.2.0.

I checked that the file I want to read exists on HDFS:

[hduser@bl460cx2425 ~]$ hadoop fs -ls /user/hduser/collector/test.txt
14/03/13 09:13:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r--   3 root supergroup       1886 2014-03-13 09:13 /user/hduser/collector/test.txt

But I get the following exception when I try to read it through my client from a remote machine:

2014-03-13 14:08:25 STDIO [ERROR] Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /user/hduser/collector/test.txt
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:51)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1499)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1448)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1428)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1402)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:468)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:269)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59566)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.Client.call(Client.java:1347)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.Client.call(Client.java:1300)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
2014-03-13 14:08:25 STDIO [ERROR] at com.sun.proxy.$Proxy17.getBlockLocations(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2014-03-13 14:08:25 STDIO [ERROR] at java.lang.reflect.Method.invoke(Method.java:606)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
2014-03-13 14:08:25 STDIO [ERROR] at com.sun.proxy.$Proxy17.getBlockLocations(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:188)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1064)
2014-03-13 14:08:25 STDIO [ERROR] ... 24 more

In my client I FTP files from a remote FTP server and put them into HDFS under /user/hduser/collector. The file name is then passed to our HDFS file-reading client, which throws the above exception.

Prompt help is really appreciated :)

BR,
Satyam
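Following up on Vinayakumar's suggestions, one way to confirm which NameNode the client is actually talking to is to compare the client-side configuration with the cluster that holds the file. A minimal diagnostic sketch (the host name, port, and NameNode IDs below are placeholders, not values from this thread):

```shell
# Print the filesystem URI the client-side configuration resolves to.
# Run this on the machine where the Camel client runs, using the same
# HADOOP_CONF_DIR the client uses.
hdfs getconf -confKey fs.defaultFS

# List the file through a fully qualified URI so the lookup cannot fall
# back to a different default filesystem (replace host/port with yours):
hadoop fs -ls hdfs://namenode-host:8020/user/hduser/collector/test.txt

# On an HA deployment, check which NameNode is currently active
# (nn1/nn2 stand in for the configured NameNode IDs of the nameservice):
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2
```

If the URI printed by `getconf` on the client machine differs from the NameNode where `hadoop fs -ls` succeeds, a FileNotFoundException on the client despite the file existing on the cluster is exactly the symptom you would see.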