hadoop-common-issues mailing list archives

From "Todd Lipcon (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-8031) Configuration class fails to find embedded .jar resources; should use URL.openStream()
Date Thu, 30 Aug 2012 01:00:09 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-8031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444582#comment-13444582 ]

Todd Lipcon commented on HADOOP-8031:

Hey Tucu. I think this commit broke the way relative XIncludes are handled in Configuration.
I have some development confs that use XInclude with non-absolute paths, and Configuration
used to pick up the included files from my conf directory. Now it seems to resolve them
against the current working directory instead.

Is it possible to fix the code so that relative paths are resolved the same way as before?
I think XInclude is fairly common in deployments.
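The behavior described above can be reproduced with a small standalone sketch (the class and file names below are made up for illustration; this is not Hadoop's code). When a `DocumentBuilder` parses a bare `InputStream`, it has no base URI, so a relative `xi:include href` is resolved against the current working directory; supplying the document's URL as the `InputSource` systemId restores resolution relative to the conf directory:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class XIncludeBaseDemo {

    // Write a hypothetical conf dir: core-site.xml pulls in extra.xml
    // via a relative XInclude, mimicking the setup described above.
    static Path writeConf() throws Exception {
        Path confDir = Files.createTempDirectory("conf");
        Files.write(confDir.resolve("extra.xml"),
            "<property><name>k</name><value>v</value></property>".getBytes("UTF-8"));
        Path main = confDir.resolve("core-site.xml");
        String xml = "<?xml version=\"1.0\"?>"
            + "<configuration xmlns:xi=\"http://www.w3.org/2001/XInclude\">"
            + "<xi:include href=\"extra.xml\"/>"
            + "</configuration>";
        Files.write(main, xml.getBytes("UTF-8"));
        return main;
    }

    static DocumentBuilder newBuilder() throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);   // required for XInclude processing
        dbf.setXIncludeAware(true);
        return dbf.newDocumentBuilder();
    }

    // Parse core-site.xml from a stream; optionally tell the parser
    // where the stream came from via the systemId.
    static boolean parses(boolean withSystemId) throws Exception {
        Path main = writeConf();
        try (InputStream in = Files.newInputStream(main)) {
            InputSource src = new InputSource(in);
            if (withSystemId) {
                // Base URI used to resolve the relative href="extra.xml".
                src.setSystemId(main.toUri().toString());
            }
            try {
                Document doc = newBuilder().parse(src);
                return doc.getElementsByTagName("name").getLength() == 1;
            } catch (Exception e) {
                // With no base URI the include is resolved against the
                // working directory and fails if the file is not there.
                return false;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("without systemId: " + parses(false));
        System.out.println("with systemId:    " + parses(true));
    }
}
```

With the systemId set, the relative include resolves against the conf directory regardless of where the JVM was started.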
> Configuration class fails to find embedded .jar resources; should use URL.openStream()
> --------------------------------------------------------------------------------------
>                 Key: HADOOP-8031
>                 URL: https://issues.apache.org/jira/browse/HADOOP-8031
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: conf
>    Affects Versions: 2.0.0-alpha
>            Reporter: Elias Ross
>            Assignee: Elias Ross
>             Fix For: 2.2.0-alpha
>         Attachments: 0001-fix-HADOOP-7982-class-loader.patch, HADOOP-8031.patch, hadoop-8031.txt
> While running a hadoop client within RHQ (monitoring software) using its classloader, I see this:
> 2012-02-07 09:15:25,313 INFO  [ResourceContainer.invoker.daemon-2] (org.apache.hadoop.conf.Configuration)- parsing jar:file:/usr/local/rhq-agent/data/tmp/rhq-hadoop-plugin-4.3.0-SNAPSHOT.jar6856622641102893436.classloader/hadoop-core-0.20.2+737+1.jar7204287718482036191.tmp!/core-default.xml
> 2012-02-07 09:15:25,318 ERROR [InventoryManager.discovery-1] (rhq.core.pc.inventory.InventoryManager)- Failed to start component for Resource[id=16290, type=NameNode, key=NameNode:/usr/lib/hadoop-0.20, name=NameNode, parent=vg61l01ad-hadoop002.apple.com] from synchronized merge.
> org.rhq.core.clientapi.agent.PluginContainerException: Failed to start component for resource Resource[id=16290, type=NameNode, key=NameNode:/usr/lib/hadoop-0.20, name=NameNode,
> Caused by: java.lang.RuntimeException: core-site.xml not found
> 	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1308)
> 	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1228)
> 	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1169)
> 	at org.apache.hadoop.conf.Configuration.set(Configuration.java:438)
> This is because the URL
> jar:file:/usr/local/rhq-agent/data/tmp/rhq-hadoop-plugin-4.3.0-SNAPSHOT.jar6856622641102893436.classloader/hadoop-core-0.20.2+737+1.jar7204287718482036191.tmp!/core-default.xml
> cannot be found by DocumentBuilder, which doesn't understand it. (Note: the logs are from an old version of the Configuration class, but the new version has the same code.)
> The solution is to obtain the resource stream directly from the URL object itself.
> That is to say:
> {code}
>          URL url = getResource((String)name);
> -        if (url != null) {
> -          if (!quiet) {
> -            LOG.info("parsing " + url);
> -          }
> -          doc = builder.parse(url.toString());
> -        }
> +        doc = builder.parse(url.openStream());
> {code}
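Putting the diff above together with the XInclude concern raised in the comment, one way to satisfy both is to read the bytes through `URL.openStream()` (so jar: URLs work) while still handing the parser the URL as the `InputSource` systemId (so relative XIncludes resolve against the resource's own location). This is a minimal sketch of that idea, not Hadoop's actual implementation; `parseResource` is a hypothetical helper:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class ResourceParse {

    // Hypothetical helper: open the stream ourselves so exotic URL forms
    // (e.g. jar:file:...!/core-default.xml) work, but keep the URL as the
    // systemId so relative XIncludes still resolve next to the resource.
    static Document parseResource(URL url) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        dbf.setXIncludeAware(true);
        DocumentBuilder builder = dbf.newDocumentBuilder();
        try (InputStream in = url.openStream()) {
            InputSource src = new InputSource(in);
            src.setSystemId(url.toString());
            return builder.parse(src);
        }
    }

    public static void main(String[] args) throws Exception {
        // Demo with a plain file: URL; jar: URLs go through the same path.
        Path p = Files.createTempFile("core-site", ".xml");
        Files.write(p, "<?xml version=\"1.0\"?><configuration/>".getBytes("UTF-8"));
        Document doc = parseResource(p.toUri().toURL());
        System.out.println(doc.getDocumentElement().getTagName());
    }
}
```

Passing only `url.openStream()` to `parse(InputStream)` fixes the jar: lookup but drops the base URI, which is exactly the relative-XInclude regression described in the comment above.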
> Note: I have a full patch pending approval at Apple for this change, including some cleanup.

This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
