From: Satyam Singh <satyam.singh@ericsson.com>
To: user@hadoop.apache.org
Subject: Reading files from hdfs directory
Date: Thu, 13 Mar 2014 07:59:13 +0000

Hello,

I want to read files from HDFS remotely through the camel-hdfs client. I have made changes in the camel-hdfs component to support Hadoop 2.2.0.

I checked that the file I want to read exists on HDFS:

[hduser@bl460cx2425 ~]$ hadoop fs -ls /user/hduser/collector/test.txt
14/03/13 09:13:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r--   3 root supergroup       1886 2014-03-13 09:13 /user/hduser/collector/test.txt

But I get the following exception when I try to read it through my client from a remote machine:

2014-03-13 14:08:25 STDIO [ERROR] Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /user/hduser/collector/test.txt
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:51)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1499)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1448)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1428)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1402)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:468)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:269)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59566)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.Client.call(Client.java:1347)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.Client.call(Client.java:1300)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
2014-03-13 14:08:25 STDIO [ERROR] at com.sun.proxy.$Proxy17.getBlockLocations(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2014-03-13 14:08:25 STDIO [ERROR] at java.lang.reflect.Method.invoke(Method.java:606)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
2014-03-13 14:08:25 STDIO [ERROR] at com.sun.proxy.$Proxy17.getBlockLocations(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:188)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1064)
2014-03-13 14:08:25 STDIO [ERROR] ... 24 more

In my client I FTP files from a remote FTP server and put them into HDFS under /user/hduser/collector. Then we pass the file name to our HDFS file-reading client, which throws the above exception.

Prompt help is really appreciated :)

BR,
Satyam
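One way to double-check, from the remote machine itself, that the reading client is talking to the same NameNode that `hadoop fs -ls` used is to query WebHDFS directly. The sketch below only builds the WebHDFS OPEN URL for the file in question; it is not the camel-hdfs code, and the NameNode HTTP port 50070 is the Hadoop 2.x default, which may differ in your cluster.

```python
# Minimal sketch: construct the WebHDFS OPEN URL for an HDFS path so it can
# be tested with curl/a browser from the remote machine. Host and port below
# are assumptions taken from the default Hadoop 2.x HTTP address.
from urllib.parse import urlencode

def webhdfs_open_url(namenode_host, path, port=50070, user="hduser"):
    """Return the WebHDFS OPEN URL for `path` on the given NameNode."""
    query = urlencode({"op": "OPEN", "user.name": user})
    return f"http://{namenode_host}:{port}/webhdfs/v1{path}?{query}"

url = webhdfs_open_url("bl460cx2425", "/user/hduser/collector/test.txt")
print(url)
# → http://bl460cx2425:50070/webhdfs/v1/user/hduser/collector/test.txt?op=OPEN&user.name=hduser
```

If fetching that URL from the remote machine also reports FileNotFoundException, the remote client is resolving a different path or a different NameNode than the shell session where `-ls` succeeded.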

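A common cause of "File does not exist" when `hadoop fs -ls` shows the file is path resolution: HDFS resolves a path without a leading "/" against the connecting user's home directory, /user/<username>, so the same relative name points at different files for different users. A hypothetical sketch of that rule:

```python
# Sketch of HDFS relative-path resolution (default home-directory layout):
# a path with no leading "/" resolves against /user/<username> for the
# connecting user, so two users can see two different files under one name.
def resolve_hdfs_path(path: str, user: str) -> str:
    """Mimic HDFS working-directory resolution for the default layout."""
    return path if path.startswith("/") else f"/user/{user}/{path}"

print(resolve_hdfs_path("collector/test.txt", "hduser"))  # path that -ls listed
print(resolve_hdfs_path("collector/test.txt", "root"))    # a different file
# → /user/hduser/collector/test.txt
# → /user/root/collector/test.txt
```

Since the exception already shows the absolute path /user/hduser/collector/test.txt, the other configuration worth checking is whether the remote client's fs.defaultFS points at the same NameNode as the shell where `-ls` succeeded.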