beam-commits mailing list archives

From "Aviem Zur (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (BEAM-2719) Beam job hangs at Evaluating ParMultiDo when submitted via spark-runner
Date Wed, 06 Sep 2017 13:07:03 GMT

    [ https://issues.apache.org/jira/browse/BEAM-2719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16155300#comment-16155300 ]

Aviem Zur commented on BEAM-2719:
---------------------------------

The problem was identified: the job here was being run on a Spark 2.1.1 cluster, while we currently only support Spark 1.6.3.
See the discussion on the user list here: https://lists.apache.org/thread.html/132c2e43d35fc546396fd97382c338066d5d2f629028f00e965eeb7a@%3Cuser.beam.apache.org%3E
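
A quick way to confirm this kind of mismatch is to compare the Spark version the cluster's spark-submit launches with against the Spark artifacts the Beam project build resolves; a minimal check (the Maven groupId filter below is only illustrative, adjust it to your build) could be:
{code}
# Print the Spark version the cluster's spark-submit will launch with
$ ~/spark/bin/spark-submit --version

# List the Spark artifacts resolved by the Beam project build; these versions
# should match the cluster's Spark version above
$ mvn dependency:tree -Dincludes=org.apache.spark
{code}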

> Beam job hangs at Evaluating ParMultiDo when submitted via spark-runner 
> ------------------------------------------------------------------------
>
>                 Key: BEAM-2719
>                 URL: https://issues.apache.org/jira/browse/BEAM-2719
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-spark
>    Affects Versions: 2.0.0
>         Environment: OSX / i5 / 10GB
>            Reporter: Sathish Jayaraman
>            Assignee: Jean-Baptiste Onofré
>
> The Beam job submitted via spark-submit does not get past the "Evaluating ParMultiDo" step. The compile-and-execute run works fine when --runner=SparkRunner is given as a parameter, but if I bundle the jar and submit it using spark-submit, no executors get assigned. I tried submitting with both a Spark master URL and YARN, but the job never gets executed past that step.
> I tried executing it on both a local single-node cluster and an Azure HDInsight cluster, and the result is the same. So I guess there is nothing wrong with the Spark configuration and this could be a bug.
> Below are the command I used to submit, the console log, and the job log from YARN.
> *Command:*
> {code}
> $ ~/spark/bin/spark-submit --class org.apache.beam.examples.WordCount --master yarn --executor-memory 2G --num-executors 2 target/word-count-beam-0.1-shaded.jar --runner=SparkRunner --inputFile=pom.xml --output=counts
> {code}
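For contrast, the working "compile execution" mentioned above presumably corresponds to an in-process, quickstart-style invocation via the exec-maven-plugin; a hypothetical reconstruction (the -Pspark-runner profile name is assumed from the word-count-beam quickstart archetype) would be:
{code}
# Assumed reconstruction of the run that works without spark-submit
$ mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.WordCount \
    -Dexec.args="--runner=SparkRunner --inputFile=pom.xml --output=counts" \
    -Pspark-runner
{code}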
> *Terminal log*
> {code}
> 17/08/03 15:18:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for
your platform... using builtin-java classes where applicable
> 17/08/03 15:18:16 INFO spark.SparkRunner: Executing pipeline using the SparkRunner.
> 17/08/03 15:18:16 INFO translation.SparkContextFactory: Creating a brand new Spark Context.
> 17/08/03 15:18:16 INFO spark.SparkContext: Running Spark version 2.1.1
> 17/08/03 15:18:17 INFO spark.SecurityManager: Changing view acls to: sathishjayaraman
> 17/08/03 15:18:17 INFO spark.SecurityManager: Changing modify acls to: sathishjayaraman
> 17/08/03 15:18:17 INFO spark.SecurityManager: Changing view acls groups to: 
> 17/08/03 15:18:17 INFO spark.SecurityManager: Changing modify acls groups to: 
> 17/08/03 15:18:17 INFO spark.SecurityManager: SecurityManager: authentication disabled;
ui acls disabled; users  with view permissions: Set(sathishjayaraman); groups with view permissions:
Set(); users  with modify permissions: Set(sathishjayaraman); groups with modify permissions:
Set()
> 17/08/03 15:18:17 INFO util.Utils: Successfully started service 'sparkDriver' on port
51207.
> 17/08/03 15:18:17 INFO spark.SparkEnv: Registering MapOutputTracker
> 17/08/03 15:18:17 INFO spark.SparkEnv: Registering BlockManagerMaster
> 17/08/03 15:18:17 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper
for getting topology information
> 17/08/03 15:18:17 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint
up
> 17/08/03 15:18:17 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/d3/d1mrkc4s023d3qv1jr6w4cg00000gp/T/blockmgr-92d9827e-49e0-4035-b206-6fb4c24aa34c
> 17/08/03 15:18:17 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
> 17/08/03 15:18:17 INFO spark.SparkEnv: Registering OutputCommitCoordinator
> 17/08/03 15:18:18 INFO util.log: Logging initialized @5489ms
> 17/08/03 15:18:18 INFO server.Server: jetty-9.2.z-SNAPSHOT
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2525a5b8{/jobs,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3458eca5{/jobs/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e0fdb2f{/jobs/job,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c380bd8{/jobs/job/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@34b87182{/stages,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@47768e74{/stages/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2740e316{/stages/stage,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b5a4aed{/stages/stage/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c991465{/stages/pool,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fef2aac{/stages/pool/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7f973a14{/storage,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@76130a29{/storage/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@124d02b2{/storage/rdd,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3451f01d{/storage/rdd/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@72503b19{/environment,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1cfc2538{/environment/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66deec87{/executors,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5d342959{/executors/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2ecf5915{/executors/threadDump,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53e76c11{/executors/threadDump/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@42cc183e{/static,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e4e8fdf{/,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a1d6ef2{/api,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2721044{/jobs/job/kill,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@205df5dc{/stages/stage/kill,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 WARN component.AbstractLifeCycle: FAILED Spark@1756f7cc{HTTP/1.1}{0.0.0.0:4040}:
java.net.BindException: Address already in use
> java.net.BindException: Address already in use
> 	at sun.nio.ch.Net.bind0(Native Method)
> 	at sun.nio.ch.Net.bind(Net.java:433)
> 	at sun.nio.ch.Net.bind(Net.java:425)
> 	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> 	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> 	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
> 	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
> 	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.spark_project.jetty.server.Server.doStart(Server.java:366)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:365)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:375)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:375)
> 	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2213)
> 	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
> 	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2204)
> 	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:375)
> 	at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:460)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:460)
> 	at scala.Option.foreach(Option.scala:257)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
> 	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
> 	at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:100)
> 	at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:69)
> 	at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:195)
> 	at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:85)
> 	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:295)
> 	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:281)
> 	at org.apache.beam.examples.WordCount.main(WordCount.java:184)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 17/08/03 15:18:18 WARN component.AbstractLifeCycle: FAILED org.spark_project.jetty.server.Server@506aabf6:
java.net.BindException: Address already in use
> java.net.BindException: Address already in use
> 	at sun.nio.ch.Net.bind0(Native Method)
> 	at sun.nio.ch.Net.bind(Net.java:433)
> 	at sun.nio.ch.Net.bind(Net.java:425)
> 	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
> 	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
> 	at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
> 	at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
> 	at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.spark_project.jetty.server.Server.doStart(Server.java:366)
> 	at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
> 	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:365)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:375)
> 	at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:375)
> 	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2213)
> 	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
> 	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2204)
> 	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:375)
> 	at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:460)
> 	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:460)
> 	at scala.Option.foreach(Option.scala:257)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
> 	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
> 	at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:100)
> 	at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:69)
> 	at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:195)
> 	at org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:85)
> 	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:295)
> 	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:281)
> 	at org.apache.beam.examples.WordCount.main(WordCount.java:184)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 17/08/03 15:18:18 INFO server.ServerConnector: Stopped Spark@1756f7cc{HTTP/1.1}{0.0.0.0:4040}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@205df5dc{/stages/stage/kill,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2721044{/jobs/job/kill,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6a1d6ef2{/api,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3e4e8fdf{/,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@42cc183e{/static,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@53e76c11{/executors/threadDump/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2ecf5915{/executors/threadDump,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5d342959{/executors/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@66deec87{/executors,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1cfc2538{/environment/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@72503b19{/environment,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3451f01d{/storage/rdd/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@124d02b2{/storage/rdd,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@76130a29{/storage/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7f973a14{/storage,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5fef2aac{/stages/pool/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2c991465{/stages/pool,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5b5a4aed{/stages/stage/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2740e316{/stages/stage,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@47768e74{/stages/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@34b87182{/stages,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3c380bd8{/jobs/job/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1e0fdb2f{/jobs/job,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3458eca5{/jobs/json,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2525a5b8{/jobs,null,UNAVAILABLE,@Spark}
> 17/08/03 15:18:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting
port 4041.
> 17/08/03 15:18:18 INFO server.Server: jetty-9.2.z-SNAPSHOT
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2525a5b8{/jobs,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3458eca5{/jobs/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e0fdb2f{/jobs/job,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c380bd8{/jobs/job/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@34b87182{/stages,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@47768e74{/stages/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2740e316{/stages/stage,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b5a4aed{/stages/stage/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c991465{/stages/pool,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fef2aac{/stages/pool/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7f973a14{/storage,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@76130a29{/storage/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@124d02b2{/storage/rdd,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3451f01d{/storage/rdd/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@72503b19{/environment,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1cfc2538{/environment/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66deec87{/executors,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5d342959{/executors/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2ecf5915{/executors/threadDump,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@53e76c11{/executors/threadDump/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@42cc183e{/static,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e4e8fdf{/,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6a1d6ef2{/api,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2721044{/jobs/job/kill,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@205df5dc{/stages/stage/kill,null,AVAILABLE,@Spark}
> 17/08/03 15:18:18 INFO server.ServerConnector: Started Spark@3ba348ca{HTTP/1.1}{0.0.0.0:4041}
> 17/08/03 15:18:18 INFO server.Server: Started @5724ms
> 17/08/03 15:18:18 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
> 17/08/03 15:18:18 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.7:4041
> 17/08/03 15:18:18 INFO spark.SparkContext: Added JAR file:/Users/sathishjayaraman/java_projects/beamexample/word-count-beam/target/word-count-beam-0.1-shaded.jar
at spark://192.168.0.7:51207/jars/word-count-beam-0.1-shaded.jar with timestamp 1501753698326
> 17/08/03 15:18:18 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
> 17/08/03 15:18:18 INFO yarn.Client: Requesting a new application from cluster with 1
NodeManagers
> 17/08/03 15:18:18 INFO yarn.Client: Verifying our application has not requested more
than the maximum memory capability of the cluster (8192 MB per container)
> 17/08/03 15:18:18 INFO yarn.Client: Will allocate AM container, with 896 MB memory including
384 MB overhead
> 17/08/03 15:18:18 INFO yarn.Client: Setting up container launch context for our AM
> 17/08/03 15:18:18 INFO yarn.Client: Setting up the launch environment for our AM container
> 17/08/03 15:18:18 INFO yarn.Client: Preparing resources for our AM container
> 17/08/03 15:18:20 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is
set, falling back to uploading libraries under SPARK_HOME.
> 17/08/03 15:18:23 INFO yarn.Client: Uploading resource file:/private/var/folders/d3/d1mrkc4s023d3qv1jr6w4cg00000gp/T/spark-1c8deda6-98af-4ac3-8719-14ca7c90ddfc/__spark_libs__5010012016677506288.zip
-> hdfs://localhost:9000/user/sathishjayaraman/.sparkStaging/application_1501749993704_0001/__spark_libs__5010012016677506288.zip
> 17/08/03 15:18:25 INFO yarn.Client: Uploading resource file:/private/var/folders/d3/d1mrkc4s023d3qv1jr6w4cg00000gp/T/spark-1c8deda6-98af-4ac3-8719-14ca7c90ddfc/__spark_conf__8657584076288234522.zip
-> hdfs://localhost:9000/user/sathishjayaraman/.sparkStaging/application_1501749993704_0001/__spark_conf__.zip
> 17/08/03 15:18:25 INFO spark.SecurityManager: Changing view acls to: sathishjayaraman
> 17/08/03 15:18:25 INFO spark.SecurityManager: Changing modify acls to: sathishjayaraman
> 17/08/03 15:18:25 INFO spark.SecurityManager: Changing view acls groups to: 
> 17/08/03 15:18:25 INFO spark.SecurityManager: Changing modify acls groups to: 
> 17/08/03 15:18:25 INFO spark.SecurityManager: SecurityManager: authentication disabled;
ui acls disabled; users  with view permissions: Set(sathishjayaraman); groups with view permissions:
Set(); users  with modify permissions: Set(sathishjayaraman); groups with modify permissions:
Set()
> 17/08/03 15:18:25 INFO yarn.Client: Submitting application application_1501749993704_0001
to ResourceManager
> 17/08/03 15:18:26 INFO impl.YarnClientImpl: Submitted application application_1501749993704_0001
> 17/08/03 15:18:26 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services
with app application_1501749993704_0001 and attemptId None
> 17/08/03 15:18:27 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:27 INFO yarn.Client: 
> 	 client token: N/A
> 	 diagnostics: AM container is launched, waiting for AM container to Register with RM
> 	 ApplicationMaster host: N/A
> 	 ApplicationMaster RPC port: -1
> 	 queue: default
> 	 start time: 1501753705959
> 	 final status: UNDEFINED
> 	 tracking URL: http://Quartics-MacBook-Pro.local:8088/proxy/application_1501749993704_0001/
> 	 user: sathishjayaraman
> 17/08/03 15:18:28 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:29 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:30 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:31 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:32 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:33 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:34 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:35 INFO yarn.Client: Application report for application_1501749993704_0001
(state: ACCEPTED)
> 17/08/03 15:18:35 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster
registered as NettyRpcEndpointRef(null)
> 17/08/03 15:18:35 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter,
Map(PROXY_HOSTS -> Quartics-MacBook-Pro.local, PROXY_URI_BASES -> http://Quartics-MacBook-Pro.local:8088/proxy/application_1501749993704_0001),
/proxy/application_1501749993704_0001
> 17/08/03 15:18:35 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
> 17/08/03 15:18:36 INFO yarn.Client: Application report for application_1501749993704_0001
(state: RUNNING)
> 17/08/03 15:18:36 INFO yarn.Client: 
> 	 client token: N/A
> 	 diagnostics: N/A
> 	 ApplicationMaster host: 192.168.0.7
> 	 ApplicationMaster RPC port: 0
> 	 queue: default
> 	 start time: 1501753705959
> 	 final status: UNDEFINED
> 	 tracking URL: http://Quartics-MacBook-Pro.local:8088/proxy/application_1501749993704_0001/
> 	 user: sathishjayaraman
> 17/08/03 15:18:36 INFO cluster.YarnClientSchedulerBackend: Application application_1501749993704_0001
has started running.
> 17/08/03 15:18:36 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService'
on port 51221.
> 17/08/03 15:18:36 INFO netty.NettyBlockTransferService: Server created on 192.168.0.7:51221
> 17/08/03 15:18:36 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy
for block replication policy
> 17/08/03 15:18:36 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver,
192.168.0.7, 51221, None)
> 17/08/03 15:18:36 INFO storage.BlockManagerMasterEndpoint: Registering block manager
192.168.0.7:51221 with 366.3 MB RAM, BlockManagerId(driver, 192.168.0.7, 51221, None)
> 17/08/03 15:18:36 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver,
192.168.0.7, 51221, None)
> 17/08/03 15:18:36 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver,
192.168.0.7, 51221, None)
> 17/08/03 15:18:36 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@350bbd5d{/metrics/json,null,AVAILABLE,@Spark}
> 17/08/03 15:18:41 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor
NettyRpcEndpointRef(null) (192.168.0.7:51226) with ID 1
> 17/08/03 15:18:41 INFO storage.BlockManagerMasterEndpoint: Registering block manager
192.168.0.7:51229 with 912.3 MB RAM, BlockManagerId(1, 192.168.0.7, 51229, None)
> 17/08/03 15:18:42 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor
NettyRpcEndpointRef(null) (192.168.0.7:51228) with ID 2
> 17/08/03 15:18:42 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready
for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
> 17/08/03 15:18:42 INFO storage.BlockManagerMasterEndpoint: Registering block manager
192.168.0.7:51230 with 912.3 MB RAM, BlockManagerId(2, 192.168.0.7, 51230, None)
> 17/08/03 15:18:42 INFO Configuration.deprecation: dfs.data.dir is deprecated. Instead,
use dfs.datanode.data.dir
> 17/08/03 15:18:42 INFO Configuration.deprecation: dfs.name.dir is deprecated. Instead,
use dfs.namenode.name.dir
> 17/08/03 15:18:42 INFO spark.SparkRunner$Evaluator: Entering directly-translatable composite
transform: 'WordCount.CountWords/Count.PerElement/Combine.perKey(Count)'
> 17/08/03 15:18:42 INFO spark.SparkRunner$Evaluator: Entering directly-translatable composite
transform: 'WriteCounts/WriteFiles/View.AsIterable'
> 17/08/03 15:18:42 INFO spark.SparkRunner$Evaluator: Entering directly-translatable composite
transform: 'WriteCounts/WriteFiles/Create.Values'
> 17/08/03 15:18:42 INFO metrics.MetricsAccumulator: Instantiated metrics accumulator:
org.apache.beam.runners.core.metrics.MetricsContainerStepMap@2b1a901d
> 17/08/03 15:18:42 INFO aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:

> 17/08/03 15:18:42 INFO spark.SparkRunner$Evaluator: Evaluating Read(CompressedSource)
> 17/08/03 15:18:42 INFO spark.SparkRunner$Evaluator: Evaluating ParMultiDo(ExtractWords)
> {code}
> *YARN Job log*
> {code}
> 17/08/03 13:00:33 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8030
> 17/08/03 13:00:33 INFO yarn.YarnRMClient: Registering the ApplicationMaster
> 17/08/03 13:00:34 INFO yarn.YarnAllocator: Will request 2 executor container(s), each
with 1 core(s) and 2432 MB memory (including 384 MB of overhead)
> 17/08/03 13:00:34 INFO yarn.YarnAllocator: Submitted 2 unlocalized container requests.
> 17/08/03 13:00:34 INFO yarn.ApplicationMaster: Started progress reporter thread with
(heartbeat : 3000, initial allocation : 200) intervals
> 17/08/03 13:00:35 INFO impl.AMRMClientImpl: Received new token for : 192.168.0.7:50173
> 17/08/03 13:00:35 INFO yarn.YarnAllocator: Launching container container_1501744514957_0003_01_000002
on host 192.168.0.7
> 17/08/03 13:00:35 INFO yarn.YarnAllocator: Received 1 containers from YARN, launching
executors on 1 of them.
> 17/08/03 13:00:35 INFO impl.ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies
: 0
> 17/08/03 13:00:35 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 192.168.0.7:50173
> 17/08/03 13:00:37 INFO yarn.YarnAllocator: Launching container container_1501744514957_0003_01_000003
on host 192.168.0.7
> 17/08/03 13:00:37 INFO yarn.YarnAllocator: Received 1 containers from YARN, launching
executors on 1 of them.
> 17/08/03 13:00:37 INFO impl.ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies
: 0
> 17/08/03 13:00:37 INFO impl.ContainerManagementProtocolProxy: Opening proxy : 192.168.0.7:50173
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
