hadoop-general mailing list archives

From Shrijeet Paliwal <shrij...@rocketfuel.com>
Subject Re: Loading conf file sitting in HDFS
Date Fri, 29 Oct 2010 20:16:48 GMT
Yeah, addResource with an InputStream as input is the friend.

I got trapped by the notion that since it takes a Path as an argument, it
will respect the scheme specified in the path.
But if you look into the implementation of loadResource:

    // Can't use FileSystem API or we get an infinite loop
    // since FileSystem uses Configuration API.  Use java.io.File instead.
    File file = new File(((Path) name).toUri().getPath()).getAbsoluteFile();

This guy just looks at the path component.
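To make that concrete, here is a minimal, standalone sketch (plain JDK, no Hadoop on the classpath; the hdfs:// URI is the hypothetical one from this thread) of what the toUri().getPath() call in loadResource effectively does:

```java
import java.io.File;
import java.net.URI;

public class SchemeDropDemo {
    public static void main(String[] args) {
        // The hdfs:// URI from the thread (hypothetical host and port).
        URI uri = URI.create("hdfs://localhost:8020/home/shrijeet/blah.xml");

        // getPath() returns only the path component, silently discarding
        // the scheme ("hdfs") and authority ("localhost:8020").
        String pathOnly = uri.getPath();
        System.out.println(pathOnly); // /home/shrijeet/blah.xml

        // loadResource then resolves this stripped path against the LOCAL
        // filesystem, which is why the file is reported "not found" even
        // though it exists in HDFS.
        File file = new File(pathOnly).getAbsoluteFile();
        System.out.println(file.getPath());
    }
}
```

The workaround later in this thread, conf.addResource(hdfs.open(apath)), sidesteps this entirely: the bytes are already read from HDFS before Configuration ever sees a path.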

On Fri, Oct 29, 2010 at 12:54 PM, Segel, Mike <msegel@navteq.com> wrote:

> Insights?
>
> I'm sure someone will correct me...
>
> What I saw was the overloaded method addResource().
>
> Looking at the various methods' input:
> addResource(String name) := name - resource to be added, the classpath is
> examined for a file with that name.
> addResource(URL url) := url - url of the resource to be added, the local
> filesystem is examined directly to find the resource, without referring to
> the classpath.
> addResource(Path file) := file - file-path of resource to be added, the
> local filesystem is examined directly to find the resource, without
> referring to the classpath.
>
> And then:
>
> addResource(InputStream in) := in - InputStream to deserialize the object
> from.
>
> The first three specifically look for a local file (on the Unix filesystem).
>
> The last one takes a generic InputStream, which could come from a Unix file
> or an HDFS file.
>
> It looks like the initial intention was to load configuration information
> from the local Unix filesystem, and HDFS support was an afterthought. (My guess.)
>
> HTH
>
> -Mike
>
> -----Original Message-----
> From: Shrijeet Paliwal [mailto:shrijeet@rocketfuel.com]
> Sent: Friday, October 29, 2010 2:39 PM
> To: general@hadoop.apache.org; mapreduce-user@hadoop.apache.org
> Subject: Re: Loading conf file sitting in HDFS
>
> I got it working by doing this (passing inputstream):
>
> conf.addResource(hdfs.open(apath));
>
> Would still be interested in hearing insights if anyone wants to share.
>
> -Shrijeet
>
> On Fri, Oct 29, 2010 at 12:18 PM, Shrijeet Paliwal
> <shrijeet@rocketfuel.com>wrote:
>
> > Hello All,
> > I am trying to load a conf file located in hdfs.
> > I was hoping the following would work:
> >
> > Path apath = new Path(conf.get("fs.default.name"),
> > "/home/shrijeet/blah.xml");
> > conf.addResource(apath);
> >
> > But I get the following exception:
> >
> > Exception in thread "main" java.lang.RuntimeException:
> > hdfs://localhost:8020/home/shrijeet/blah.xml not found
> >         at
> >
> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1297)
> >         at
> >
> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1227)
> >
> >
> > File does exist however,
> >
> > [12:20][shrijeet@shrijeet-desktop]~$ hadoop dfs -ls
> > hdfs://localhost:8020/home/shrijeet/blah.xml
> > Found 1 items
> > -rw-r--r--   1 shrijeet shrijeet        488 2010-10-29 12:16
> > /home/shrijeet/blah.xml
> >
> > Thoughts?
> >
> > -Shrijeet
> >
>
>
>
