From: unmesha sreeveni <unmeshabiju@gmail.com>
Date: Mon, 22 Dec 2014 15:33:36 +0530
Subject: Re: FileNotFoundException in distributed mode
To: User Hadoop <user@hadoop.apache.org>

Driver:

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path cachefile = new Path("path/to/file");
    FileStatus[] list = fs.globStatus(cachefile);
    for (FileStatus status : list) {
        DistributedCache.addCacheFile(status.getPath().toUri(), conf);
    }

In setup():

    public void setup(Context context) throws IOException {
        Configuration conf = context.getConfiguration();
        FileSystem fs = FileSystem.get(conf);
        URI[] cacheFiles = DistributedCache.getCacheFiles(conf);
        Path getPath = new Path(cacheFiles[0].getPath());
        BufferedReader bf = new BufferedReader(new InputStreamReader(fs.open(getPath)));
        String setupData = null;
        while ((setupData = bf.readLine()) != null) {
            System.out.println("Setup line in reducer: " + setupData);
        }
        bf.close();
    }

Hope this link helps:
http://unmeshasreeveni.blogspot.in/2014/10/how-to-load-file-in-distributedcache-in.html

On Mon, Dec 22, 2014 at 2:58 PM, Marko Dinic <marko.dinic@nissatech.com> wrote:
> Hello Hadoopers,
>
> I'm getting this exception in Hadoop while trying to read a file that was
> added to the distributed cache, and the strange thing is that the file
> exists at the given location:
>
>     java.io.FileNotFoundException: File does not exist:
>     /tmp/hadoop-pera/mapred/local/taskTracker/distcache/-1517670662102870873_-1918892372_1898431787/localhost/work/output/temporalcentroids/centroids-iteration0-noOfClusters2/part-r-00000
>
> I'm adding the file before starting my job using
>
>     DistributedCache.addCacheFile(URI.create(args[2]), job.getConfiguration());
>
> and I'm trying to read from the file in the setup method of my mapper using
>
>     DistributedCache.getLocalCacheFiles(conf);
>
> As I said, I can confirm that the file is on the local file system, but the
> exception is thrown anyway.
>
> I'm running the job in pseudo-distributed mode, on one computer.
>
> Any ideas?
>
> Thanks

-- 
Thanks & Regards
Unmesha Sreeveni U.B
Hadoop, Bigdata Developer
Centre for Cyber Security | Amrita Vishwa Vidyapeetham
http://www.unmeshasreeveni.blogspot.in/
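A note on the driver snippet: because it goes through fs.globStatus, the cache path may be a glob (for example part-r-*) and every matching file gets registered in the cache. Hadoop's glob syntax is close to java.nio's "glob" PathMatcher, so the matching behavior can be sketched without any Hadoop dependency (the file names below are made up for illustration):

```java
import java.nio.file.FileSystems;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;

public class GlobDemo {
    public static void main(String[] args) {
        // Hadoop's FileSystem.globStatus expands patterns much like
        // java.nio's "glob" matcher: '*' matches any run of characters
        // within one path segment.
        PathMatcher m = FileSystems.getDefault().getPathMatcher("glob:part-r-*");

        System.out.println(m.matches(Paths.get("part-r-00000"))); // true
        System.out.println(m.matches(Paths.get("_SUCCESS")));     // false
    }
}
```

With a pattern like output/part-r-*, the driver loop above would register every reducer output file while skipping markers such as _SUCCESS.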
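One possible reading of the original FileNotFoundException (an assumption, not confirmed in the thread): the file is registered with addCacheFile but then read through getLocalCacheFiles, and the two APIs return different kinds of paths. DistributedCache.getCacheFiles(conf) returns the original URIs (typically in HDFS), which are opened through FileSystem.get(conf); getLocalCacheFiles(conf) returns paths to task-local copies that only exist inside a running task, after the framework has localized the file. A plain-Java sketch of the two path shapes involved (no Hadoop dependency; both URIs are illustrative, not taken from the thread):

```java
import java.net.URI;

public class CachePathDemo {
    public static void main(String[] args) {
        // Shape of what getCacheFiles(conf) hands back: the URI originally
        // passed to addCacheFile, pointing into HDFS. Open it with
        // FileSystem.get(conf).open(new Path(uri.getPath())).
        URI cacheUri = URI.create("hdfs://localhost:9000/output/centroids/part-r-00000");

        // Shape of what getLocalCacheFiles(conf) hands back inside a running
        // task: a localized copy under the task tracker's local directory.
        // It must be opened with local-file APIs and is not valid on the
        // client or before the task starts.
        URI localCopy = URI.create("file:///tmp/hadoop/mapred/local/taskTracker/distcache/123/part-r-00000");

        System.out.println(cacheUri.getScheme());  // hdfs
        System.out.println(localCopy.getScheme()); // file
    }
}
```

Mixing the two (for example, opening a localized path through the HDFS FileSystem, or resolving cache files outside a task) is a common way to hit "File does not exist" even though the file is visibly on disk.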