hadoop-common-issues mailing list archives

From "Miklos Szegedi (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HADOOP-15331) Race conditions caused by org.apache.hadoop.conf.Configuration: error parsing conf java.io.BufferedInputStream
Date Wed, 21 Mar 2018 01:16:00 GMT
Miklos Szegedi created HADOOP-15331:
---------------------------------------

             Summary: Race conditions caused by org.apache.hadoop.conf.Configuration: error
parsing conf java.io.BufferedInputStream
                 Key: HADOOP-15331
                 URL: https://issues.apache.org/jira/browse/HADOOP-15331
             Project: Hadoop Common
          Issue Type: Bug
          Components: common
    Affects Versions: 3.0.0, 2.7.5, 2.8.3, 2.9.0, 3.1.0, 2.10.0
            Reporter: Miklos Szegedi
            Assignee: Miklos Szegedi


There is a race condition in the way Hadoop handles the Configuration class. The scenario
is the following. Two threads share the same Configuration instance: one adds resources to
the configuration while the other clones it. Resources are loaded lazily, in a deferred call
to {{loadResources()}}. If the clone is made after the resources are added but before they
are parsed, temporary resources such as input stream pointers are copied as well. Both copies
then end up loading resources that point to the same input streams. One parses the XML from
the stream and closes it, updating its own copy of the resource. The other still holds a
pointer to the same, now closed, stream, and crashes with a "Stream closed" exception when
it tries to load it.
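
Schematically, the problematic interleaving looks like the following. This is an illustrative
sketch only: the thread bodies, the {{someXmlStream}} variable and the property names are
hypothetical, and the failure is timing dependent, which is why the deterministic unit test
below clones the configuration explicitly instead of relying on thread scheduling.
{code:java}
// Illustrative interleaving only; not a deterministic reproduction.
final Configuration shared = new Configuration();

Thread adder = new Thread(() -> {
  // Registers the stream as a pending resource; parsing is deferred.
  shared.addResource(new BufferedInputStream(someXmlStream));
  // A later get() parses the pending stream and closes it.
  shared.get("some.key");
});

Thread cloner = new Thread(() -> {
  // If this runs between addResource() and the parse above, the clone
  // copies the unread stream pointer, so both objects share one stream.
  Configuration clone = new Configuration(shared);
  clone.get("another.key"); // whichever copy parses second sees "Stream closed"
});

adder.start();
cloner.start();
{code}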

Here is a unit test that reproduces the failure deterministically:
{code:java}
@Test
public void testResourceRace() throws Exception {
  // A stream that can only be consumed once, wrapping a minimal configuration XML.
  InputStream is = new InputStream() {
    private final InputStream delegate =
        new ByteArrayInputStream(
            "<configuration></configuration>".getBytes());
    @Override
    public int read() throws IOException {
      return delegate.read();
    }
  };
  Configuration conf = new Configuration();
  // addResource() only registers the stream; parsing is deferred to the first get().
  conf.addResource(new BufferedInputStream(is));
  // Cloning before the parse copies the unread stream pointer into the clone.
  Configuration confClone = new Configuration(conf);
  // The first parse consumes and closes the shared stream...
  confClone.get("firstParse");
  // ...so the second parse fails with a "Stream closed" exception.
  conf.get("secondParse");
}{code}
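In the test above the failure can be avoided by forcing the pending stream to be parsed before
the clone is made, so the clone only copies already parsed properties. This is a workaround
sketch, not a fix for the underlying race, and the property name {{force.parse}} is just a
placeholder used to trigger the deferred load:
{code:java}
Configuration conf = new Configuration();
conf.addResource(new BufferedInputStream(is));
// Force the deferred parse now, before cloning; get() goes through
// getProps() -> loadResources(), which consumes and closes the stream.
conf.get("force.parse");
// The clone no longer shares an unread stream pointer with the parent.
Configuration confClone = new Configuration(conf);
confClone.get("firstParse");
conf.get("secondParse");
{code}
In a truly concurrent setting the add and the clone would still need external synchronization,
so the proper fix belongs in Configuration itself.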
Example real-world stack traces:
{code:java}
2018-02-28 08:23:19,589 ERROR org.apache.hadoop.conf.Configuration: error parsing conf java.io.BufferedInputStream@7741d346
com.ctc.wstx.exc.WstxIOException: Stream closed
	at com.ctc.wstx.stax.WstxInputFactory.doCreateSR(WstxInputFactory.java:578)
	at com.ctc.wstx.stax.WstxInputFactory.createSR(WstxInputFactory.java:633)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2853)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2817)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1420)
	at org.apache.hadoop.security.authorize.ServiceAuthorizationManager.refreshWithLoadedConfiguration(ServiceAuthorizationManager.java:161)
	at org.apache.hadoop.ipc.Server.refreshServiceAclWithLoadedConfiguration(Server.java:607)
	at org.apache.hadoop.yarn.server.resourcemanager.AdminService.refreshServiceAcls(AdminService.java:586)
	at org.apache.hadoop.yarn.server.resourcemanager.AdminService.startServer(AdminService.java:188)
	at org.apache.hadoop.yarn.server.resourcemanager.AdminService.serviceStart(AdminService.java:165)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:121)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1231)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1421)
{code}
Another example:
{code:java}
2018-02-28 08:23:20,702 ERROR org.apache.hadoop.conf.Configuration: error parsing conf java.io.BufferedInputStream@7741d346
com.ctc.wstx.exc.WstxIOException: Stream closed
	at com.ctc.wstx.stax.WstxInputFactory.doCreateSR(WstxInputFactory.java:578)
	at com.ctc.wstx.stax.WstxInputFactory.createSR(WstxInputFactory.java:633)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2853)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2817)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1326)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1298)
	at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebApp.buildRedirectPath(RMWebApp.java:103)
	at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebApp.getRedirectPath(RMWebApp.java:91)
	at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebAppFilter.doFilter(RMWebAppFilter.java:125)
	at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:829)
	at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)
	at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)
	at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)
	at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)
	at com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)
	at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.security.http.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:57)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter.doFilter(StaticUserWebFilter.java:110)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1560)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
	at org.eclipse.jetty.server.Server.handle(Server.java:534)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)
	at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.execute(ExecuteProduceConsume.java:100)
	at org.eclipse.jetty.io.ManagedSelector.run(ManagedSelector.java:147)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Stream closed
	at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:336)
	at com.ctc.wstx.io.StreamBootstrapper.ensureLoaded(StreamBootstrapper.java:482)
	at com.ctc.wstx.io.StreamBootstrapper.resolveStreamEncoding(StreamBootstrapper.java:306)
	at com.ctc.wstx.io.StreamBootstrapper.bootstrapInput(StreamBootstrapper.java:167)
	at com.ctc.wstx.stax.WstxInputFactory.doCreateSR(WstxInputFactory.java:573)
	... 50 more
2018-02-28 08:23:20,705 WARN org.eclipse.jetty.servlet.ServletHandler: /jmx
java.lang.RuntimeException: com.ctc.wstx.exc.WstxIOException: Stream closed
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3048)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2817)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2689)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1326)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1298)
	at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebApp.buildRedirectPath(RMWebApp.java:103)
	at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebApp.getRedirectPath(RMWebApp.java:91)
	at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebAppFilter.doFilter(RMWebAppFilter.java:125)
	at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:829)
	at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)
	at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)
	at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)
	at com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)
	at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.security.http.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:57)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter.doFilter(StaticUserWebFilter.java:110)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1560)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
	at org.eclipse.jetty.server.Server.handle(Server.java:534)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)
	at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.execute(ExecuteProduceConsume.java:100)
	at org.eclipse.jetty.io.ManagedSelector.run(ManagedSelector.java:147)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
	at java.lang.Thread.run(Thread.java:748)
Caused by: com.ctc.wstx.exc.WstxIOException: Stream closed
	at com.ctc.wstx.stax.WstxInputFactory.doCreateSR(WstxInputFactory.java:578)
	at com.ctc.wstx.stax.WstxInputFactory.createSR(WstxInputFactory.java:633)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2803)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2853)
	... 46 more
Caused by: java.io.IOException: Stream closed
	at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:336)
	at com.ctc.wstx.io.StreamBootstrapper.ensureLoaded(StreamBootstrapper.java:482)
	at com.ctc.wstx.io.StreamBootstrapper.resolveStreamEncoding(StreamBootstrapper.java:306)
	at com.ctc.wstx.io.StreamBootstrapper.bootstrapInput(StreamBootstrapper.java:167)
	at com.ctc.wstx.stax.WstxInputFactory.doCreateSR(WstxInputFactory.java:573)
	... 49 more
2018-02-28 08:23:20,715 INFO org.{code}



